Posted to user@flume.apache.org by vijay k <k....@gmail.com> on 2012/07/02 09:11:46 UTC

Re: Flume agent failure

Hi Mike,

Please find the flume-ng script execution output below.

root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
Info: Sourcing environment configuration script
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop)
for HDFS access
+ exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m -cp
'/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
-Djava.library.path=:/usr/local/hadoop/bin/../lib/native/Linux-i386-32
org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
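
Note that the -cp value in the exec line above contains only Flume's own conf and lib directories and no Hadoop jars, which is why the HDFS classes cannot be resolved in the log below. A minimal sketch (with illustrative paths; a temp dir stands in for the real install) of how the flume-ng launcher assembles that classpath:

```shell
# Sketch of how the flume-ng launcher builds the -cp value (illustrative
# paths; a temp dir stands in for the Flume install).
FLUME_HOME=$(mktemp -d)
mkdir -p "$FLUME_HOME/conf" "$FLUME_HOME/lib"
touch "$FLUME_HOME/lib/flume-sdk.jar"

# Flume's own pieces: the conf directory plus every jar under lib/.
FLUME_CLASSPATH="$FLUME_HOME/conf:$FLUME_HOME/lib/*"

# When a hadoop launcher is found, flume-ng appends the output of
# "hadoop classpath". If that command prints nothing (or fails), no HDFS
# jars reach the classpath and SequenceFile$CompressionType cannot load.
HADOOP_CP=""   # stands in for: $("$HADOOP_HOME/bin/hadoop" classpath)
if [ -n "$HADOOP_CP" ]; then
  FLUME_CLASSPATH="$FLUME_CLASSPATH:$HADOOP_CP"
fi
echo "$FLUME_CLASSPATH"
```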


Flume.log
==========

root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
more flume.log
2012-07-02 12:37:40,326 INFO lifecycle.LifecycleSupervisor: Starting
lifecycle supervisor 1
2012-07-02 12:37:40,327 INFO node.FlumeNode: Flume node starting - agent1
2012-07-02 12:37:40,329 INFO nodemanager.DefaultLogicalNodeManager: Node
manager starting
2012-07-02 12:37:40,329 INFO lifecycle.LifecycleSupervisor: Starting
lifecycle supervisor 10
2012-07-02 12:37:40,329 INFO
properties.PropertiesFileConfigurationProvider: Configuration provider
starting
2012-07-02 12:37:40,330 INFO
properties.PropertiesFileConfigurationProvider: Reloading configuration
file:conf/agent1.conf
2012-07-02 12:37:40,337 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-02 12:37:40,338 INFO conf.FlumeConfiguration: Added sinks: HDFS
Agent: agent1
2012-07-02 12:37:40,354 INFO conf.FlumeConfiguration: Post-validation flume
configuration contains configuration  for agents: [agent1]
2012-07-02 12:37:40,354 INFO
properties.PropertiesFileConfigurationProvider: Creating channels
2012-07-02 12:37:40,357 INFO
properties.PropertiesFileConfigurationProvider: created channel
MemoryChannel-2
2012-07-02 12:37:40,365 INFO sink.DefaultSinkFactory: Creating instance of
sink HDFS typehdfs
2012-07-02 12:37:40,369 ERROR
properties.PropertiesFileConfigurationProvider: Failed to start agent
because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError:
org/apache/hadoop/io/SequenceFile$CompressionType
        at
org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:204)
        at
org.apache.flume.conf.Configurables.configure(Configurables.java:41)
        at
org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
        at
org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
        at
org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
        at
org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
        at
org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at
java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.io.SequenceFile$CompressionType
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
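
The NoClassDefFoundError above means no Hadoop jars ended up on Flume's classpath. A minimal flume-env.sh for this kind of setup (paths copied from this thread; treat them as assumptions to adjust for your own install) would be:

```shell
# Sketch of a flume-env.sh (sourced by the flume-ng launcher as plain
# shell). Paths are the ones used in this thread, not universal defaults.
JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26/jre
JAVA_OPTS="-Xms100m -Xmx200m"
# Point at the Hadoop install so the launcher can locate the HDFS jars
# that provide org.apache.hadoop.io.SequenceFile.
HADOOP_HOME=/usr/local/hadoop
```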
Please help me on this issue.

Thanks,
vijay
On Sat, Jun 30, 2012 at 2:46 AM, Mike Percy <mp...@cloudera.com> wrote:

> Vijay,
> Can you please post the output from flume-ng script when you start it now?
>
> This will be useful info for debugging:
>
> + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp … etc …
>
> Regards,
> Mike
>
>
> On Friday, June 29, 2012 at 1:00 AM, vijay k wrote:
>
> > Hi,
> >
> > Thanks for the reply.
> >
> > I have installed hadoop in /usr/local/hadoop and added the variables
> > below in the flume-env.sh file, and re-ran bin/flume-ng agent
> > -n agent1 -c conf -f conf/agent1.conf,
> > but I am still facing the same error.
> >
> > flume-env.sh
> > ============
> > # Enviroment variables can be set here.
> >
> > #JAVA_HOME=/usr/lib/jvm/java-6-sun
> > JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26/jre
> > # Give Flume more memory and pre-allocate
> > #JAVA_OPTS="-Xms100m -Xmx200m"
> >
> > # Note that the Flume conf directory is always included in the classpath.
> >
> FLUME_CLASSPATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
> > HADOOP_HOME=/usr/local/hadoop
> >
> >
> > Error in flume.log file
> > =======================
> > 2012-06-29 13:19:23,679 ERROR
> > properties.PropertiesFileConfigurationProvider: Failed to start agent
> > because dependencies were not found in classpath. Error follows.
> >
> >
> >
> > Please let me know if I am doing anything wrong.
> >
> > Thanks,
> > Vijay
> >
> >
> > On 6/29/12, Mike Percy <mpercy@cloudera.com (mailto:mpercy@cloudera.com)>
> wrote:
> > > Vijay - Flume does not include the HDFS libraries. This is because
> every
> > > major version of HDFS is wire-incompatible with all the others. So you
> will
> > > need to install Hadoop, set your HADOOP_HOME variable to point to the
> Hadoop
> > > installation (define this in flume-env.sh),
> then restart Flume and you
>  > > should be good to go.
> > >
> > > Regards,
> > > Mike
> > >
> > >
> > > On Thursday, June 28, 2012 at 2:02 AM, vijay k wrote:
> > >
> > > > Hi,
> > > >
> > > > I have removed the
> > > >
> 'log4j.appender.LOGFILE=org.apache.flume.lifecycle.LifecycleSupervisor'
> > > > in the log4j.properties file,
> > > >
> > > > and i run the bin/flume-ng agent -n agent1 -c conf -f
> conf/agent1.conf
> > > > command, i got stuck on the execution like below.
> > > >
> > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
> > > > bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
> > > > Info: Sourcing environment configuration script
> > > >
> /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
> > > > + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
> > > >
> '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*'
> > > > -Djava.library.path= org.apache.flume.node.Application -n agent1 -f
> > > > conf/agent1.conf
> > > >
> > > >
> > > >
> > > > Flume.log file
> > > > ====================
> > > >
> > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
> > > > more flume.log
> > > > 2012-06-28 14:17:31,556 INFO lifecycle.LifecycleSupervisor: Starting
> > > > lifecycle supervisor 1
> > > > 2012-06-28 14:17:31,558 INFO node.FlumeNode: Flume node starting -
> agent1
> > > > 2012-06-28 14:17:31,559 INFO nodemanager.DefaultLogicalNodeManager:
> > > > Node manager starting
> > > > 2012-06-28 14:17:31,559 INFO lifecycle.LifecycleSupervisor: Starting
> > > > lifecycle supervisor 10
> > > > 2012-06-28 14:17:31,560 INFO
> > > > properties.PropertiesFileConfigurationProvider: Configuration
> provider
> > > > starting
> > > > 2012-06-28 14:17:31,561 INFO
> > > > properties.PropertiesFileConfigurationProvider: Reloading
> > > > configuration file:conf/agent1.conf
> > > > 2012-06-28 14:17:31,566 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Processing:HDFS
> > > > 2012-06-28 14:17:31,567 INFO conf.FlumeConfiguration: Added sinks:
> > > > HDFS Agent: agent1
> > > > 2012-06-28 14:17:31,582 INFO conf.FlumeConfiguration: Post-validation
> > > > flume configuration contains configuration for agents: [agent1]
> > > > 2012-06-28 14:17:31,582 INFO
> > > > properties.PropertiesFileConfigurationProvider: Creating channels
> > > > 2012-06-28 14:17:31,587 INFO
> > > > properties.PropertiesFileConfigurationProvider: created channel
> > > > MemoryChannel-2
> > > > 2012-06-28 14:17:31,595 INFO sink.DefaultSinkFactory: Creating
> > > > instance of sink HDFS typehdfs
> > > > 2012-06-28 14:17:31,599 ERROR
> > > > properties.PropertiesFileConfigurationProvider: Failed to start agent
> > > > because dependencies were not found in classpath. Error follows.
> > > > java.lang.NoClassDefFoundError:
> > > > org/apache/hadoop/io/SequenceFile$CompressionType
> > > > at
> > > >
> org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:204)
> > > > at
> org.apache.flume.conf.Configurables.configure(Configurables.java:41)
> > > > at
> > > >
> org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
> > > > at
> > > >
> org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
> > > > at
> > > >
> org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
> > > > at
> > > >
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
> > > > at
> > > >
> java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
> > > > at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
> > > > at
> > > >
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
> > > > at
> > > >
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > > at
> > > >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > > at java.lang.Thread.run(Thread.java:662)
> > > > Caused by: java.lang.ClassNotFoundException:
> > > > org.apache.hadoop.io.SequenceFile$CompressionType
> > > > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >
> > > >
> > > >
> > > >
> > > >
> > > > In which agent file can I add the -Dflume.root.logger=INFO,console
> > > > option?
> > > >
> > > > Please help me on this issue.
> > > >
> > > >
> > > > Thanks,
> > > > Vijay
> > > >
> > > >
> > > > On 6/28/12, Hari Shreedharan <hshreedharan@cloudera.com (mailto:
> hshreedharan@cloudera.com)
> > > > (mailto:hshreedharan@cloudera.com)> wrote:
> > > > > Yes, Mike is right. I missed the "console."
> > > > >
> > > > > Thanks
> > > > > Hari
> > > > > --
> > > > > Hari Shreedharan
> > > > >
> > > > >
> > > > > On Wednesday, June 27, 2012 at 11:55 PM, Mike Percy wrote:
> > > > >
> > > > > > I think there was a minor typo there in the email, it should be
> > > > > > -Dflume.root.logger=INFO,console
> > > > > >
> > > > > > Regards,
> > > > > > Mike
> > > > > >
> > > > > > On Wednesday, June 27, 2012, vijay k wrote:
> > > > > > > Thanks for the reply Hari,
> > > > > > >
> > > > > > > I will try the below command, and let you know the result.
> > > > > > >
> > > > > > > On Thu, Jun 28, 2012 at 7:52 AM, Hari Shreedharan
> > > > > > > <hshreedharan@cloudera.com (mailto:hshreedharan@cloudera.com)>
> > > > > > > wrote:
> > > > > > > > Vijay,
> > > > > > > >
> > > > > > > > You are asking flume to look at /conf(rather than ./conf)
> for the
> > > > > > > > log4j properties file. Please change the command to:
> > > > > > > >
> > > > > > > > bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
> > > > > > > >
> > > > > > > > Also please remove the line you added to the log4j properties
> > > > > > > > file. It
> > > > > > > > is not valid because LifecycleSupervisor is not a
> Log4jAppender.
> > > > > > > > Just
> > > > > > > > leave the config as specified and you will see the log in the
> > > > > > > > same
> > > > > > > > folder you are running the agent from, or specify
> > > > > > > > -Dflume.root.logger=INFO,console in the flume agent command to
> > > > > > > > have flume dump the logs to console.
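
For reference, the console-logging invocation being described (with the comma inside the -D property value) can be composed as follows; the flags are the ones discussed in this thread, built up as strings so each piece is visible:

```shell
# Corrected agent invocation with console logging; flags taken from the
# discussion above, composed as strings so each piece is visible.
FLUME_CMD="bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf"
LOG_OPT="-Dflume.root.logger=INFO,console"
echo "$FLUME_CMD $LOG_OPT"
```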
> > > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > > Thanks
> > > > > > > > Hari
> > > > > > > >
> > > > > > > >
> > > > > > > > --
> > > > > > > > Hari Shreedharan
> > > > > > > >
> > > > > > > >
> > > > > > > > On Wednesday, June 27, 2012 at 7:10 PM, vijay k wrote:
> > > > > > > >
> > > > > > > > > Can anyone respond on the below issue?
> > > > > > > > >
> > > > > > > > > On Tue, Jun 26, 2012 at 8:20 PM, vijay k <
> k.vijay52@gmail.com (mailto:k.vijay52@gmail.com)
> > > > > > > > > (mailto:k.vijay52@gmail.com)>
> > > > > > > > > wrote:
> > > > > > > > > >
> > > > > > > > > > Hi,
> > > > > > > > > > I have run flume-ng, but it is not moving forward; it
> > > > > > > > > > hangs. Below is my agent1.conf config file:
> > > > > > > > > >
> > > > > > > > > > agent1.conf configuration
> > > > > > > > > > ---------------------------------
> > > > > > > > > >
> > > > > > > > > > agent1.sources = tail
> > > > > > > > > > agent1.channels = MemoryChannel-2
> > > > > > > > > > agent1.sinks = HDFS
> > > > > > > > > > agent1.sources.tail.type = exec
> > > > > > > > > > agent1.sources.tail.command = tail -F /var/log/syslog.1
> > > > > > > > > > agent1.sources.tail.channels = MemoryChannel-2
> > > > > > > > > > agent1.sinks.HDFS.channel = MemoryChannel-2
> > > > > > > > > > agent1.sinks.HDFS.type = hdfs
> > > > > > > > > > agent1.sinks.HDFS.hdfs.path = hdfs://
> 10.5.114.110:9000/flume
> > > > > > > > > > (http://10.5.114.110:9000/flume)
> > > > > > > > > > agent1.sinks.HDFS.hdfs.file.Type = DataStream
> > > > > > > > > > agent1.channels.MemoryChannel-2.type = memory
> > > > > > > > > >
> > > > > > > > > > I have run agent1.conf by using following command:
> > > > > > > > > >
> > > > > > > > > > #bin/flume-ng agent -n agent1 -c /conf -f
> conf/agent1.conf
> > > > > > > > > >
> > > > > > > > > >
> > > > > > > > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf#
> > > > > > > > > > ls -lrt
> > > > > > > > > > total 24
> > > > > > > > > > -rw-r--r-- 1 root root 2070 2012-06-26 12:57
> log4j.properties
> > > > > > > > > > -rw-r--r-- 1 root root 1132 2012-06-26 12:57
> > > > > > > > > > flume-env.sh.template
> > > > > > > > > > -rw-r--r-- 1 root root 1661 2012-06-26 12:57
> > > > > > > > > > flume-conf.properties.template
> > > > > > > > > > -rw-r--r-- 1 root root 1661 2012-06-26 19:35 flume.conf
> > > > > > > > > > -rw-r--r-- 1 root root 1132 2012-06-26 19:36
> flume-env.sh
> > > > > > > > > > -rw-r--r-- 1 root root 438 2012-06-26 19:38 agent1.conf
> > > > > > > > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf#
> > > > > > > > > > chmod 775 agent1.conf
> > > > > > > > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf#
> > > > > > > > > > cd ..
> > > > > > > > > >
> > > > > > > > > > Here, i am getting the following error.
> > > > > > > > > >
> > > > > > > > > > root@md-trngpoc1
> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
> > > > > > > > > > bin/flume-ng agent -n agent1 -c /conf -f conf/agent1.conf
> > > > > > > > > > + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
> > > > > > > > > >
> '/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*'
> > > > > > > > > > -Djava.library.path= org.apache.flume.node.Application -n
> > > > > > > > > > agent1
> > > > > > > > > > -f conf/agent1.conf
> > > > > > > > > > log4j:WARN No appenders could be found for logger
> > > > > > > > > > (org.apache.flume.lifecycle.LifecycleSupervisor).
> > > > > > > > > > log4j:WARN Please initialize the log4j system properly.
> > > > > > > > > > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> > > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
>
>
>
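
For reference, here is the agent1.conf quoted above with one likely fix: in Flume 1.x the HDFS sink property is spelled hdfs.fileType, so the quoted hdfs.file.Type line would be ignored and the sink would fall back to its SequenceFile default.

```properties
# agent1.conf as quoted in this thread, with the fileType key corrected.
agent1.sources = tail
agent1.channels = MemoryChannel-2
agent1.sinks = HDFS

agent1.sources.tail.type = exec
agent1.sources.tail.command = tail -F /var/log/syslog.1
agent1.sources.tail.channels = MemoryChannel-2

agent1.sinks.HDFS.channel = MemoryChannel-2
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:9000/flume
# The key is hdfs.fileType (not hdfs.file.Type); DataStream writes raw
# events instead of the default SequenceFile container.
agent1.sinks.HDFS.hdfs.fileType = DataStream

agent1.channels.MemoryChannel-2.type = memory
```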

Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Yes Mike, you are correct. I executed the "$HADOOP_HOME/bin/hadoop
classpath" command and I am getting the following error.

hduser@md-trngpoc1:/usr/local/hadoop$ bin/hadoop classpath
Exception in thread "main" java.lang.NoClassDefFoundError: classpath
Caused by: java.lang.ClassNotFoundException: classpath
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: classpath.  Program will exit.

That is the reason I was not able to run flume-ng. Anyway, I am going to
install the stable Hadoop 1.0.3 version.

Thanks a lot for your help.
Vijay
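
The broken `hadoop classpath` can be checked in isolation. The sketch below stubs out a Hadoop launcher so it is self-contained; on a real box you would point HADOOP_HOME at the actual install (e.g. /usr/local/hadoop from this thread) instead of the stub:

```shell
# Self-contained check of the "hadoop classpath" contract that flume-ng
# depends on. A stub launcher stands in for a real Hadoop install.
HADOOP_HOME=$(mktemp -d)/hadoop
mkdir -p "$HADOOP_HOME/bin"
cat > "$HADOOP_HOME/bin/hadoop" <<'EOF'
#!/bin/sh
# Stub: a healthy "hadoop classpath" prints jar paths and exits 0.
[ "$1" = classpath ] && { echo "/usr/local/hadoop/hadoop-core.jar"; exit 0; }
exit 1
EOF
chmod +x "$HADOOP_HOME/bin/hadoop"

# The check: if this fails (as it did above), Flume cannot find HDFS jars.
if CP=$("$HADOOP_HOME/bin/hadoop" classpath 2>/dev/null); then
  echo "hadoop classpath OK: $CP"
else
  echo "hadoop classpath is broken; fix or upgrade Hadoop first"
fi
```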


On Fri, Jul 6, 2012 at 12:11 AM, Mike Percy <mp...@cloudera.com> wrote:

> Vijay, I agree with Alex & Mohammed, your Hadoop installation seems messed
> up. Either that, or you have a copy & paste error in your output...
>
> Also, apparently you are trying to run Hadoop 0.20.204? It would be much
> better for you to either use a stable version of Hadoop, such as 1.0.3, or
> use the Hadoop version from CDH 3 or 4. I think 0.20.204 may even be
> missing hsync (I know it's there in 0.20.205), and if so then the Flume
> HDFS Sink will not work.
>
> At a minimum, ensure that your HADOOP_HOME is pointing to a location so
> that you can execute "$HADOOP_HOME/bin/hadoop classpath" and it will output
> some jars, etc. The output you showed below looks like it should be the
> output of "which hadoop"... if it's not then the hadoop classpath command
> is broken in your version of hadoop and since Flume depends on it working,
> you are kind of out of luck without hacking up the start script or putting
> jars and config files into the Flume lib directory.
>
> Regards,
> Mike
>
>
> On Thu, Jul 5, 2012 at 5:08 AM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> If you are using CDH then it's always better to uninstall the previous
>> version. If it is Apache Hadoop then it's just a matter of changing
>> the directory.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jul 5, 2012 at 5:28 PM, vijay k <k....@gmail.com> wrote:
>> > Hi,
>> >
>> > Without removing the older version, shall I install Hadoop 0.20.204 in
>> > another folder?
>> >
>> > Thanks,
>> > vijay
>> >
>> > On Wed, Jul 4, 2012 at 7:26 PM, alo alt <wg...@gmail.com> wrote:
>> >>
>> >> Yes.
>> >>
>> >> Follow these link:
>> >> http://hadoop.apache.org/common/docs/r0.20.204.0/
>> >>
>> >> to install a single node or cluster. Note: on the sink side you need the
>> >> same HDFS version as you have in your cluster.
>> >>
>> >> - Alex
>> >>
>> >>
>> >> On Jul 4, 2012, at 3:51 PM, vijay k wrote:
>> >>
>> >> > I have already downloaded the hadoop-0.20.204.0.tar.gz,
>> >> >
>> >> > shall I go ahead with the Hadoop 0.20.204 installation? Please confirm
>> >> > that it is compatible with Flume 1.2.0.
>> >> >
>> >> > Thanks,
>> >> > Vijay
>> >> >
>> >> > On Wed, Jul 4, 2012 at 7:03 PM, alo alt <wg...@gmail.com> wrote:
>> >> >
>> >> >> 1. Flume uses the libs from the classpath
>> >> >> 2. remove the installed hadoop version (apt-get remove hadoop*)
>> >> >> 3. http://hadoop.apache.org/common/docs/   => click at the version
>> you
>> >> >> want to install and follow the steps described there
>> >> >>
>> >> >> - Alex
>> >> >>
>> >> >> On Jul 4, 2012, at 2:36 PM, vijay k wrote:
>> >> >>
>> >> >>> Hi Alex,
>> >> >>>
>> >> >>> I have installed hadoop via tar xzf hadoop-0.20.2.zip on Ubuntu, but I
>> >> >>> was not able to find the *hadoop-install* directory.
>> >> >>> Please let me know about the following queries:
>> >> >>>
>> >> >>> 1. Which latest Hadoop version is compatible with Flume 1.2.0?
>> >> >>> 2. If I need to remove my Hadoop 0.20.0 version, how do I remove the
>> >> >>> older version?
>> >> >>> 3. Please provide installation steps for the latest Hadoop version.
>> >> >>>
>> >> >>> Thanks a lot for your comments.
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> On Wed, Jul 4, 2012 at 5:28 PM, alo alt <wg...@gmail.com>
>> wrote:
>> >> >>>
>> >> >>>> U're on Ubuntu / Debian?
>> >> >>>>
>> >> >>>> If yes, remove the hadoop installation via apt; you have a really old
>> >> >>>> one installed. Take a look into the /usr/local/hadoop/lib directory
>> >> >>>> (as you posted before):
>> >> >>>>
>> >> >>>> -rw-r--r-- 1 hduser hadoop   15010 2010-02-19 13:25
>> xmlenc-0.52.jar
>> >> >>>> -rw-r--r-- 1 hduser hadoop    8601 2010-02-19 13:25
>> >> >> slf4j-log4j12-1.4.3.jar
>> >> >>>>
>> >> >>>> and so on. From 2010 - pretty old.
>> >> >>>>
>> >> >>>> How did you install hadoop? I think you have already installed from the
>> >> >>>> tarball into /hadoop-install/hadoop - there should be a ../lib dir too.
>> >> >>>> Export these and try again.
>> >> >>>>
>> >> >>>> - Alex
>> >> >>>>
>> >> >>>>
>> >> >>>> On Jul 4, 2012, at 1:50 PM, vijay k wrote:
>> >> >>>>
>> >> >>>>> Hi Alex,
>> >> >>>>>
>> >> >>>>> I have set HADOOP_CLASSPATH=/usr/local/hadoop/lib, but I am still
>> >> >>>>> unable to proceed further.
>> >> >>>>> Please find the attached flume-ng script. Please guide me if I did
>> >> >>>>> anything wrong in the script.
>> >> >>>>>
>> >> >>>>> Thanks a lot for your helping to proceed further.
>> >> >>>>>
>> >> >>>>>
>> >> >>>>> On Wed, Jul 4, 2012 at 5:08 PM, alo alt <wg...@gmail.com>
>> wrote:
>> >> >>>>>
>> >> >>>>>> Hi Vijay,
>> >> >>>>>>
>> >> >>>>>> but you've set the path to:
>> >> >>>>>>> hadoop classpath        -- /usr/local/hadoop/bin/hadoop
>> >> >>>>>>
>> >> >>>>>> should be: /usr/local/hadoop/lib
>> >> >>>>>>
>> >> >>>>>> Flume reads out the variable and looks into that path for all needed
>> >> >>>>>> classes / jars. But in /usr/local/hadoop/bin/hadoop flume found
>> >> >>>>>> nothing.
>> >> >>>>>>
>> >> >>>>>> - Alex
>> >> >>>>>>
>> >> >>>>>>
>> >> >>>>>> On Jul 4, 2012, at 1:33 PM, vijay k wrote:
>> >> >>>>>>
>> >> >>>>>>> Hi Alex,
>> >> >>>>>>>
>> >> >>>>>>> Hadoop libs are installed at the following path:
>> >> >>>>>>>
>> >> >>>>>>> root@md-trngpoc1
>> >> >>>>>> :/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/bin#
>> >> >>>>>>> cd /usr/local/hadoop/lib
>> >> >>>>>>> root@md-trngpoc1:/usr/local/hadoop/lib# ls -lrt
>> >> >>>>>>> total 8428
>> >> >>>>>>
>> >> >>>>>>
>> >> >>>>>> --
>> >> >>>>>> Alexander Alten-Lorenz
>> >> >>>>>> http://mapredit.blogspot.com
>> >> >>>>>> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >> >>>>>>
>> >> >>>>>>
>> >> >>>>> <flume-ng script.txt>
>> >> >>>>
>> >> >>>>
>> >> >>>> --
>> >> >>>> Alexander Alten-Lorenz
>> >> >>>> http://mapredit.blogspot.com
>> >> >>>> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >> >>>>
>> >> >>>>
>> >> >>
>> >> >>
>> >> >> --
>> >> >> Alexander Alten-Lorenz
>> >> >> http://mapredit.blogspot.com
>> >> >> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >> >>
>> >> >>
>> >>
>> >>
>> >> --
>> >> Alexander Alten-Lorenz
>> >> http://mapredit.blogspot.com
>> >> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >>
>> >
>>
>
>

Re: Flume agent failure

Posted by Mike Percy <mp...@cloudera.com>.
Vijay, I agree with Alex & Mohammed, your Hadoop installation seems messed
up. Either that, or you have a copy & paste error in your output...

Also, apparently you are trying to run Hadoop 0.20.204? It would be much
better for you to either use a stable version of Hadoop, such as 1.0.3, or
use the Hadoop version from CDH 3 or 4. I think 0.20.204 may even be
missing hsync (I know it's there in 0.20.205), and if so then the Flume
HDFS Sink will not work.

At a minimum, ensure that your HADOOP_HOME is pointing to a location so
that you can execute "$HADOOP_HOME/bin/hadoop classpath" and it will output
some jars, etc. The output you showed below looks like it should be the
output of "which hadoop"... if it's not then the hadoop classpath command
is broken in your version of hadoop and since Flume depends on it working,
you are kind of out of luck without hacking up the start script or putting
jars and config files into the Flume lib directory.

Regards,
Mike


Re: Flume agent failure

Posted by Mohammad Tariq <do...@gmail.com>.
If you are using CDH then it's always better to uninstall the previous
version. If it is Apache Hadoop then it's just a matter of changing
the directory.

Regards,
    Mohammad Tariq



Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Hi,

Without removing the older version, shall I install Hadoop 0.20.204 in another
folder?

Thanks,
vijay


Re: Flume agent failure

Posted by alo alt <wg...@gmail.com>.
Yes. 

Follow this link:
http://hadoop.apache.org/common/docs/r0.20.204.0/

to install a single node or a cluster. Note: on the sink side you need the same HDFS version as you have in your cluster.

- Alex




--
Alexander Alten-Lorenz
http://mapredit.blogspot.com
German Hadoop LinkedIn Group: http://goo.gl/N8pCF


Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
I have already downloaded hadoop-0.20.204.0.tar.gz.

Shall I go ahead with the Hadoop 0.20.204 installation? Please confirm that it
is compatible with Flume 1.2.0.

Thanks,
Vijay


Re: Flume agent failure

Posted by alo alt <wg...@gmail.com>.
1. Flume uses the libs from the classpath.
2. Remove the installed Hadoop version (apt-get remove hadoop*).
3. http://hadoop.apache.org/common/docs/   => click on the version you want to install and follow the steps described there.

- Alex
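The steps above can be sketched as a short shell session. This is a sketch only: the version number, tarball location, and install prefix are assumptions taken from this thread, so adjust them for your system, and the apt-get line is left commented so you can review what it would remove first.

```shell
# Step 2: drop the old apt-managed Hadoop (review before running).
# sudo apt-get remove 'hadoop*'

# Step 3: unpack the tarball release and point the environment at it.
HADOOP_VERSION="0.20.204.0"
TARBALL="hadoop-${HADOOP_VERSION}.tar.gz"

# Unpack only if the tarball is actually present in the current directory.
if [ -f "${TARBALL}" ]; then
    tar xzf "${TARBALL}" -C /usr/local
fi

export HADOOP_HOME="/usr/local/hadoop-${HADOOP_VERSION}"
export PATH="${HADOOP_HOME}/bin:${PATH}"
echo "HADOOP_HOME=${HADOOP_HOME}"
```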





Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Hi Alex,

I have installed Hadoop via tar xzf hadoop-0.20.2.zip on Ubuntu, but I
could not find the *hadoop-install* directory.
Please help me with the following queries:

1. Which latest Hadoop version is compatible with Flume 1.2.0?
2. If I need to remove my Hadoop 0.20.0 version, how do I remove the older
version?
3. Please provide installation steps for the latest Hadoop version.

Thanks a lot for your comments.




Re: Flume agent failure

Posted by alo alt <wg...@gmail.com>.
You're on Ubuntu / Debian?

If yes, remove the Hadoop installation via apt; you have a really old one installed. Take a look into the /usr/local/hadoop/lib directory (as you posted before):

-rw-r--r-- 1 hduser hadoop   15010 2010-02-19 13:25 xmlenc-0.52.jar
-rw-r--r-- 1 hduser hadoop    8601 2010-02-19 13:25 slf4j-log4j12-1.4.3.jar

and so on. From 2010 - pretty old.

How did you install Hadoop? I think you already installed from the tarball into /hadoop-install/hadoop - there should be a ../lib dir there too. Export that and try again.

- Alex






Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Hi Alex,

I have set HADOOP_CLASSPATH=/usr/local/hadoop/lib, but I am still unable to
proceed further.
Please find the attached flume-ng script, and guide me if I did anything
wrong in it.

Thanks a lot for helping me proceed further.



Re: Flume agent failure

Posted by alo alt <wg...@gmail.com>.
Hi Vijay,

but you've set the path to:
> hadoop classpath        -- /usr/local/hadoop/bin/hadoop

It should be: /usr/local/hadoop/lib

Flume reads that variable and looks in that path for all the needed classes / jars. But in /usr/local/hadoop/bin/hadoop Flume found nothing.

- Alex
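The shape of the fix Alex describes can be sketched as below. This is a sketch, not the original poster's actual flume-env.sh: the paths come from this thread, and whether the bare lib directory or the Java 6 "/*" wildcard form works best depends on how your flume-ng script assembles its classpath.

```shell
# Wrong: this is the hadoop launcher script, not a jar directory, so Flume
# finds no HDFS classes there.
# HADOOP_CLASSPATH="/usr/local/hadoop/bin/hadoop"

# Right: point at the directory that actually holds the jars; the quoted
# "/*" suffix is a literal Java classpath wildcard, not a shell glob here.
HADOOP_LIB_DIR="/usr/local/hadoop/lib"
export HADOOP_CLASSPATH="${HADOOP_LIB_DIR}/*"
echo "HADOOP_CLASSPATH=${HADOOP_CLASSPATH}"
```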






Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Hi Alex,

Hadoop lib are installed following path

root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/bin#
cd /usr/local/hadoop/lib
root@md-trngpoc1:/usr/local/hadoop/lib# ls -lrt
total 8428
-rw-r--r-- 1 hduser hadoop   15010 2010-02-19 13:25 xmlenc-0.52.jar
-rw-r--r-- 1 hduser hadoop    8601 2010-02-19 13:25 slf4j-log4j12-1.4.3.jar
-rw-r--r-- 1 hduser hadoop   15345 2010-02-19 13:25 slf4j-api-1.4.3.jar
-rw-r--r-- 1 hduser hadoop  132368 2010-02-19 13:25
servlet-api-2.5-6.1.14.jar
-rw-r--r-- 1 hduser hadoop   65261 2010-02-19 13:25 oro-2.0.8.jar
-rw-r--r-- 1 hduser hadoop 1312472 2010-02-19 13:25 mockito-all-1.8.0.jar
-rw-r--r-- 1 hduser hadoop  391834 2010-02-19 13:25 log4j-1.2.15.jar
-rw-r--r-- 1 hduser hadoop   11358 2010-02-19 13:25 kfs-0.2.LICENSE.txt
-rw-r--r-- 1 hduser hadoop   11428 2010-02-19 13:25 kfs-0.2.2.jar
-rw-r--r-- 1 hduser hadoop  121070 2010-02-19 13:25 junit-3.8.1.jar
-rw-r--r-- 1 hduser hadoop  163121 2010-02-19 13:25 jetty-util-6.1.14.jar
-rw-r--r-- 1 hduser hadoop  516429 2010-02-19 13:25 jetty-6.1.14.jar
-rw-r--r-- 1 hduser hadoop  321806 2010-02-19 13:25 jets3t-0.6.1.jar
-rw-r--r-- 1 hduser hadoop   76698 2010-02-19 13:25
jasper-runtime-5.5.12.jar
-rw-r--r-- 1 hduser hadoop  405086 2010-02-19 13:25
jasper-compiler-5.5.12.jar
-rw-r--r-- 1 hduser hadoop    3434 2010-02-19 13:25
hsqldb-1.8.0.10.LICENSE.txt
-rw-r--r-- 1 hduser hadoop  706710 2010-02-19 13:25 hsqldb-1.8.0.10.jar
-rw-r--r-- 1 hduser hadoop 3566844 2010-02-19 13:25 core-3.1.1.jar
-rw-r--r-- 1 hduser hadoop  180792 2010-02-19 13:25 commons-net-1.4.1.jar
-rw-r--r-- 1 hduser hadoop   26202 2010-02-19 13:25
commons-logging-api-1.0.4.jar
-rw-r--r-- 1 hduser hadoop   38015 2010-02-19 13:25
commons-logging-1.0.4.jar
-rw-r--r-- 1 hduser hadoop  279781 2010-02-19 13:25
commons-httpclient-3.0.1.jar
-rw-r--r-- 1 hduser hadoop  112341 2010-02-19 13:25 commons-el-1.0.jar
-rw-r--r-- 1 hduser hadoop   46725 2010-02-19 13:25 commons-codec-1.3.jar
-rw-r--r-- 1 hduser hadoop   41123 2010-02-19 13:25 commons-cli-1.2.jar
drwxr-xr-x 2 hduser hadoop    4096 2012-05-04 15:32 jsp-2.1
drwxr-xr-x 4 hduser hadoop    4096 2012-05-04 15:32 native
drwxr-xr-x 2 hduser hadoop    4096 2012-05-04 15:32 jdiff

Thanks,
vijay


Re: Flume agent failure

Posted by alo alt <wg...@gmail.com>.
On Jul 4, 2012, at 12:05 PM, vijay k wrote:

> echo $HADOOP_HOME   --- /hadoop-install/hadoop
> hadoop classpath        -- /usr/local/hadoop/bin/hadoop

Looks odd to me. Your classpath is empty; the libs are installed in ../lib.

- Alex



Re: Flume agent failure

Posted by vijay k <k....@gmail.com>.
Hi Mike,

Please find the output of the following commands:

which hadoop -- hadoop 0.20.2
echo $HADOOP_HOME   --- /hadoop-install/hadoop
hadoop classpath        -- /usr/local/hadoop/bin/hadoop
which java    --1.6
echo $JAVA_HOME    --- /usr/lib/jvm/java-6-sun-1.6.0.26/jre
java -version
 -- java version "1.6.0_26"
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) Server VM (build 20.1-b02, mixed mode)
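The individual checks above can be gathered into one small diagnostic snippet. This is a sketch: it prints "<not found>" or "<unset>" for anything missing instead of an empty line, which makes gaps in the environment easy to spot.

```shell
# Print the toolchain locations this thread asks about, in one go.
echo "hadoop binary: $(command -v hadoop || echo '<not found>')"
echo "HADOOP_HOME:   ${HADOOP_HOME:-<unset>}"
echo "java binary:   $(command -v java || echo '<not found>')"
echo "JAVA_HOME:     ${JAVA_HOME:-<unset>}"

# Show the JVM version only if a java binary is actually present.
if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
fi
```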

flume-ng debug mode output
=====================
root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
bash -x bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
+ FLUME_AGENT_CLASS=org.apache.flume.node.Application
+ FLUME_AVRO_CLIENT_CLASS=org.apache.flume.client.avro.AvroCLIClient
+
FLUME_CLASSPATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
+
FLUME_JAVA_LIBRARY_PATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib
+ JAVA_OPTS=-Xmx20m
+ opt_conf=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
+ opt_classpath=
+ opt_java_props=
+ opt_dryrun=
+ mode=agent
+ shift
+ case "$mode" in
+ opt_agent=1
+ '[' -n '-n agent1 -c conf -f conf/agent1.conf' ']'
+ arg=-n
+ shift
+ case "$arg" in
+ args=' -n'
+ '[' -n 'agent1 -c conf -f conf/agent1.conf' ']'
+ arg=agent1
+ shift
+ case "$arg" in
+ args=' -n agent1'
+ '[' -n '-c conf -f conf/agent1.conf' ']'
+ arg=-c
+ shift
+ case "$arg" in
+ '[' -n conf ']'
+ opt_conf=conf
+ shift
+ '[' -n '-f conf/agent1.conf' ']'
+ arg=-f
+ shift
+ case "$arg" in
+ args=' -n agent1 -f'
+ '[' -n conf/agent1.conf ']'
+ arg=conf/agent1.conf
+ shift
+ case "$arg" in
+ args=' -n agent1 -f conf/agent1.conf'
+ '[' -n '' ']'
+ [[ -n conf ]]
+ [[ -d conf ]]
++ cd conf
++ pwd
+ opt_conf=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
+ '[' -z /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf ']'
+ '[' -f
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
']'
+ info 'Sourcing environment configuration script
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh'
+ local 'msg=Sourcing environment configuration script
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh'
+ echo 'Info: Sourcing environment configuration script
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh'
Info: Sourcing environment configuration script
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
+ source
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
++ JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.26/jre
++ JAVA_OPTS='-Xms100m -Xmx200m'
++
FLUME_CLASSPATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf
++ HADOOP_HOME=/usr/local/hadoop
+ '[' -n '' ']'
+ '[' -n '' ']'
+ '[' -z '' ']'
+++ dirname bin/flume-ng
++ cd bin/..
++ pwd
+ FLUME_HOME=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT
+ '[' -n /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf ']'
+
FLUME_CLASSPATH='/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
+ '[' -z /usr/lib/jvm/java-6-sun-1.6.0.26/jre ']'
+ add_hadoop_paths
++
PATH=/usr/local/hadoop/bin:/usr/lib/jvm/java-6-sun/bin:/usr/local/flume_dir/apache-maven-3.0.4/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/hadoop-install/hadoop/bin
++ which hadoop
+ local HADOOP_IN_PATH=/usr/local/hadoop/bin/hadoop
+ '[' -f /usr/local/hadoop/bin/hadoop ']'
+ info 'Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access'
+ local 'msg=Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access'
+ echo 'Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access'
Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access
+ local HADOOP_CLASSPATH=/usr/local/hadoop/bin/hadoop
++ HADOOP_CLASSPATH='/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
++ /usr/local/hadoop/bin/hadoop org.apache.flume.tools.GetJavaProperty java.library.path
+ local HADOOP_JAVA_LIBRARY_PATH=java.library.path=/usr/local/hadoop/bin/../lib/native/Linux-i386-32
+ IFS='
'
+ for line in '$HADOOP_JAVA_LIBRARY_PATH'
+ [[ java.library.path=/usr/local/hadoop/bin/../lib/native/Linux-i386-32 =~ ^java\.library\.path=(.*)$ ]]
+ HADOOP_JAVA_LIBRARY_PATH=/usr/local/hadoop/bin/../lib/native/Linux-i386-32
+ break
+ unset IFS
+ '[' -n /usr/local/hadoop/bin/../lib/native/Linux-i386-32 ']'
+ FLUME_JAVA_LIBRARY_PATH=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop/bin/../lib/native/Linux-i386-32
++ /usr/local/hadoop/bin/hadoop classpath
+ HADOOP_CLASSPATH=
++ sed -e 's/:/ /g'
+ local ELEMENTS=
+ local ELEMENT
+ add_HBASE_paths
++ PATH=/bin:/usr/lib/jvm/java-6-sun/bin:/usr/local/flume_dir/apache-maven-3.0.4/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/hadoop-install/hadoop/bin
++ which hbase
+ local HBASE_IN_PATH=
+ '[' -f '' ']'
+ '[' -n /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf ']'
+ FLUME_CLASSPATH='/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
+ EXEC=exec
+ '[' -n '' ']'
+ '[' -n 1 ']'
+ run_flume org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
+ local FLUME_APPLICATION_CLASS
+ '[' 5 -gt 0 ']'
+ FLUME_APPLICATION_CLASS=org.apache.flume.node.Application
+ shift
+ set -x
+ exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m -cp '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf' -Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop/bin/../lib/native/Linux-i386-32 org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
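For what it's worth, the java.library.path parsing step that the trace performs can be reproduced on its own. This is just a sketch using the literal value from this run's trace, so you can see the extraction in isolation:

```shell
#!/usr/bin/env bash
# Simulated output of:
#   hadoop org.apache.flume.tools.GetJavaProperty java.library.path
HADOOP_JAVA_LIBRARY_PATH='java.library.path=/usr/local/hadoop/bin/../lib/native/Linux-i386-32'

# Same loop and regex bin/flume-ng uses to strip the "java.library.path=" prefix
IFS=$'\n'
for line in $HADOOP_JAVA_LIBRARY_PATH; do
  if [[ $line =~ ^java\.library\.path=(.*)$ ]]; then
    HADOOP_JAVA_LIBRARY_PATH=${BASH_REMATCH[1]}
    break
  fi
done
unset IFS
echo "$HADOOP_JAVA_LIBRARY_PATH"
# → /usr/local/hadoop/bin/../lib/native/Linux-i386-32
```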


flume.log
==================
root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
more flume.log
2012-07-04 14:58:30,690 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 1
2012-07-04 14:58:30,691 INFO node.FlumeNode: Flume node starting - agent1
2012-07-04 14:58:30,694 INFO nodemanager.DefaultLogicalNodeManager: Node manager starting
2012-07-04 14:58:30,694 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor 10
2012-07-04 14:58:30,694 INFO properties.PropertiesFileConfigurationProvider: Configuration provider starting
2012-07-04 14:58:30,696 INFO properties.PropertiesFileConfigurationProvider: Reloading configuration file:conf/agent1.conf
2012-07-04 14:58:30,702 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-04 14:58:30,703 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-04 14:58:30,703 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-04 14:58:30,703 INFO conf.FlumeConfiguration: Processing:HDFS
2012-07-04 14:58:30,703 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: agent1
2012-07-04 14:58:30,718 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [agent1]
2012-07-04 14:58:30,718 INFO properties.PropertiesFileConfigurationProvider: Creating channels
2012-07-04 14:58:30,722 INFO properties.PropertiesFileConfigurationProvider: created channel MemoryChannel-2
2012-07-04 14:58:30,730 INFO sink.DefaultSinkFactory: Creating instance of sink HDFS typehdfs

Kindly let me know where I am going wrong.

Thanks,
Vijay

On Tue, Jul 3, 2012 at 10:14 PM, Mike Percy <mp...@cloudera.com> wrote:

> On Mon, Jul 2, 2012 at 12:11 AM, vijay k <k....@gmail.com> wrote:
>
>>
>>
>> root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
>> bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
>> Info: Sourcing environment configuration script
>> /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
>> Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop)
>> for HDFS access
>> + exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m
>> -cp
>> '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
>> -Djava.library.path=:/usr/local/hadoop/bin/../lib/native/Linux-i386-32
>> org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
>>
>
> This is really strange. Your java.library.path is set to
> /usr/local/hadoop/lib/native/Linux-i386-32 but for some reason the output
> of "hadoop classpath" has not made it into your classpath. I've never seen
> that combination before - usually both are there or both are missing.
> That's because the same hadoop binary is used for both cases. Are you on a
> 32-bit OS? I don't know if anyone has ever tested Flume on a 32-bit OS.
>
> If you can post the output of the following 6 commands, it might give us
> more to go on:
>
> which hadoop
> echo $HADOOP_HOME
> hadoop classpath
> which java
> echo $JAVA_HOME
> java -version
>
> If you are fluent in shell scripting, you can also trace the bin/flume-ng
> script and see where it's going wrong. Or just invoke it as:
>
>   bash -x bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
>
> And post the output of that as well (it will be quite long).
>
> Regards,
> Mike
>
>

Re: Flume agent failure

Posted by ankul rastogi <an...@gmail.com>.
Hi,

I had the same problem. The quick solution that worked for me was to copy
hadoop-core-1.0.3.jar to the <flume-installation>/lib folder. You can find
this jar in your Hadoop installation directory.
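A sketch of that copy step, using throwaway temp directories in place of the real Hadoop and Flume install locations (adjust the paths to your own installation; the jar name here is just the 1.0.3 example):

```shell
#!/usr/bin/env bash
set -e
# Stand-ins for the real $HADOOP_HOME and Flume home (illustrative only)
HADOOP_HOME=$(mktemp -d)
FLUME_HOME=$(mktemp -d)
mkdir -p "$FLUME_HOME/lib"
touch "$HADOOP_HOME/hadoop-core-1.0.3.jar"   # placeholder for the real jar

# The actual workaround: drop the Hadoop core jar into Flume's lib folder
cp "$HADOOP_HOME"/hadoop-core-*.jar "$FLUME_HOME/lib/"
ls "$FLUME_HOME/lib"
```

After this, restarting the agent picks the jar up via the lib/* classpath entry shown in the trace.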

On Tue, Jul 3, 2012 at 10:14 PM, Mike Percy <mp...@cloudera.com> wrote:

> On Mon, Jul 2, 2012 at 12:11 AM, vijay k <k....@gmail.com> wrote:
>
>>
>>
>> root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
>> bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
>> Info: Sourcing environment configuration script
>> /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
>> Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop)
>> for HDFS access
>> + exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m
>> -cp
>> '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
>> -Djava.library.path=:/usr/local/hadoop/bin/../lib/native/Linux-i386-32
>> org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
>>
>
> This is really strange. Your java.library.path is set to
> /usr/local/hadoop/lib/native/Linux-i386-32 but for some reason the output
> of "hadoop classpath" has not made it into your classpath. I've never seen
> that combination before - usually both are there or both are missing.
> That's because the same hadoop binary is used for both cases. Are you on a
> 32-bit OS? I don't know if anyone has ever tested Flume on a 32-bit OS.
>
> If you can post the output of the following 6 commands, it might give us
> more to go on:
>
> which hadoop
> echo $HADOOP_HOME
> hadoop classpath
> which java
> echo $JAVA_HOME
> java -version
>
> If you are fluent in shell scripting, you can also trace the bin/flume-ng
> script and see where it's going wrong. Or just invoke it as:
>
>   bash -x bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
>
> And post the output of that as well (it will be quite long).
>
> Regards,
> Mike
>
>

Re: Flume agent failure

Posted by Mike Percy <mp...@cloudera.com>.
On Mon, Jul 2, 2012 at 12:11 AM, vijay k <k....@gmail.com> wrote:

>
> root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
> bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf
> Info: Sourcing environment configuration script
> /usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf/flume-env.sh
> Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop)
> for HDFS access
> + exec /usr/lib/jvm/java-6-sun-1.6.0.26/jre/bin/java -Xms100m -Xmx200m -cp
> '/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf'
> -Djava.library.path=:/usr/local/hadoop/bin/../lib/native/Linux-i386-32
> org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
>

This is really strange. Your java.library.path is set to
/usr/local/hadoop/lib/native/Linux-i386-32 but for some reason the output
of "hadoop classpath" has not made it into your classpath. I've never seen
that combination before - usually both are there or both are missing.
That's because the same hadoop binary is used for both cases. Are you on a
32-bit OS? I don't know if anyone has ever tested Flume on a 32-bit OS.

If you can post the output of the following 6 commands, it might give us
more to go on:

which hadoop
echo $HADOOP_HOME
hadoop classpath
which java
echo $JAVA_HOME
java -version
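
Those checks can also be run as one small script — a sketch, nothing Flume-specific, that prints "not found"/"<unset>" instead of failing when something is missing, so the output is always complete:

```shell
#!/usr/bin/env bash
# Collect the environment details requested above in one pass.
report=''
for c in hadoop java; do
  loc=$(command -v "$c" || echo 'not found in PATH')
  report+="$c: $loc"$'\n'
done
report+="HADOOP_HOME=${HADOOP_HOME:-<unset>}"$'\n'
report+="JAVA_HOME=${JAVA_HOME:-<unset>}"
printf '%s\n' "$report"
```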

If you are fluent in shell scripting, you can also trace the bin/flume-ng
script and see where it's going wrong. Or just invoke it as:

  bash -x bin/flume-ng agent -n agent1 -c conf -f conf/agent1.conf

And post the output of that as well (it will be quite long).

Regards,
Mike