Posted to user@flume.apache.org by Swati Ramteke <Sw...@persistent.co.in> on 2012/10/12 17:09:46 UTC

Not able to write data from source log file to sink [HDFS]

Hi,

While running the Flume agent, I am getting the following error:



hduser@vm-ps7274:/home/hadoop/Downloads/apache-flume-1.2.0$ bin/flume-ng agent -c conf  -f conf/flumeHdfs.conf -Dflume.root.logger=DEBUGG,console  -n agent1

Info: Sourcing environment configuration script /home/hadoop/Downloads/apache-flume-1.2.0/conf/flume-env.sh

+ exec /opt/java/java/jdk1.6.0_32/bin/java -Xmx20m -Dflume.root.logger=DEBUGG,console -cp '/home/hadoop/Downloads/apache-flume-1.2.0/conf:/root/f1/apache-flume-1.2.0//lib/*' -Djava.library.path= org.apache.flume.node.Application -f conf/flumeHdfs.conf -n agent1

bin/flume-ng: line 210: /opt/java/java/jdk1.6.0_32/bin/java: cannot execute binary file

bin/flume-ng: line 210: /opt/java/java/jdk1.6.0_32/bin/java: Success






Please find flumeHdfs.conf below:

##Agent to copy the log from source to HDFS sink

# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory



# Define an EXEC source called src on agent1 and connect it to channel ch1.
agent1.sources.src.channels = ch1
agent1.sources.src.type = exec
agent1.sources.src.command = tail -F /home/hadoop/Downloads/apache-flume-1.2.0/h.txt


# Define a HDFS sink and connect it to the other end of the same channel.
agent1.sinks.HDFS.channel = ch1
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://localhost:54310/user/hduser
agent1.sinks.HDFS.hdfs.fileType = DataStream
agent1.sinks.HDFS.hdfs.writeFormat = Text
agent1.sinks.HDFS.hdfs.filePrefix = FlumeTest



# Finally, now that we've defined all of our components, tell
# agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = src
agent1.sinks = HDFS
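
One aside on the channel definition above: Flume's memory channel defaults to a fairly small capacity, so if the exec source ever outpaces the HDFS sink the channel can fill and the source will stall. A hedged sketch of optional tuning (the values below are illustrative, not from the original post):

```
# Optional memory channel tuning (illustrative values)
agent1.channels.ch1.capacity = 1000
agent1.channels.ch1.transactionCapacity = 100
```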


And here is flume-env.sh:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# If this file is placed at FLUME_CONF_DIR/flume-env.sh, it will be sourced
# during Flume startup.

# Environment variables can be set here.

#JAVA_HOME=/usr/lib/jvm/java-6-sun

# Give Flume more memory and pre-allocate, enable remote monitoring via JMX
#JAVA_OPTS="-Xms100m -Xmx200m -Dcom.sun.management.jmxremote"

# Note that the Flume conf directory is always included in the classpath.
#FLUME_CLASSPATH=""

export FLUME_HOME=/root/f1/apache-flume-1.2.0/

export JAVA_HOME=/opt/java/java/jdk1.6.0_32


export FLUME_CONF_DIR=/root/f1/apache-flume-1.2.0/conf/
#export HADOOP_HOME=/hduser/home/hadoop/Downloads/hadoop_files/bin


export PATH=$JAVA_HOME/bin:$FLUME_HOME/bin:$PATH
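
To confirm which JVM flume-ng will actually pick up from this file, one option is to source it and probe the binary directly. A minimal sketch, assuming the agent is launched from the Flume install directory as in the command above; the checks are guarded so they degrade gracefully if a path differs:

```shell
# Source the env file and probe the configured JVM (paths assumed from
# the flume-env.sh above; adjust ENV if yours lives elsewhere).
ENV=conf/flume-env.sh
[ -f "$ENV" ] && . "$ENV"
echo "JAVA_HOME=${JAVA_HOME:-unset}"
if [ -x "${JAVA_HOME:-/nonexistent}/bin/java" ]; then
  "$JAVA_HOME/bin/java" -version
else
  echo "java is not executable at ${JAVA_HOME:-unset}/bin/java"
fi
```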



As per my understanding, this issue is related to the Java path not being set correctly. Please advise me if I am wrong.

Thanks & Regards,
Swati

DISCLAIMER
==========
This e-mail may contain privileged and confidential information which is the property of Persistent Systems Ltd. It is intended only for the use of the individual or entity to which it is addressed. If you are not the intended recipient, you are not authorized to read, retain, copy, print, distribute or use this message. If you have received this communication in error, please notify the sender and delete all copies of this message. Persistent Systems Ltd. does not accept any liability for virus infected mails.

Re: Not able to write data from source log file to sink [HDFS]

Posted by Brock Noland <br...@cloudera.com>.
What happens when you run just:

/opt/java/java/jdk1.6.0_32/bin/java

Does it actually execute the java command?
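
A common cause of "cannot execute binary file" is an architecture mismatch, e.g. a 64-bit JDK unpacked on a 32-bit VM, or a truncated/corrupt download. One way to check, as a sketch (the JDK path is taken from the error message and may differ on your machine):

```shell
# Compare the OS architecture with the JVM binary's architecture.
JAVA_BIN=/opt/java/java/jdk1.6.0_32/bin/java   # path from the error message
ARCH=$(uname -m)               # e.g. i686 (32-bit) vs x86_64 (64-bit)
echo "machine architecture: $ARCH"
if [ -f "$JAVA_BIN" ]; then
  file "$JAVA_BIN"             # reports ELF 32-bit vs ELF 64-bit
else
  echo "no java binary at $JAVA_BIN"
fi
```

If the two architectures disagree, installing a JDK built for the machine's architecture (and pointing JAVA_HOME at it) should resolve the error.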

Brock

On Fri, Oct 12, 2012 at 10:09 AM, Swati Ramteke
<Sw...@persistent.co.in> wrote:
> [original message quoted in full; snipped]



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/