Posted to common-user@hadoop.apache.org by Alex Luya <al...@gmail.com> on 2010/07/31 10:38:55 UTC
How to get lzo compression library loaded?
Hello:
I have followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ to install the LZO compression library, copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib and all files under ../lib/native/Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64, then ran an example, but got these errors:
----------------------------------------------------------------------------------------------------------------
Exception in thread "main" java.lang.IllegalArgumentException: Compression codec org.apache.hadoop.io.compress.GzipCodec not found.
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
    at com.hadoop.compression.lzo.LzoIndex.createIndex(LzoIndex.java:202)
    at com.hadoop.compression.lzo.LzoIndexer.indexSingleFile(LzoIndexer.java:117)
    at com.hadoop.compression.lzo.LzoIndexer.indexInternal(LzoIndexer.java:98)
    at com.hadoop.compression.lzo.LzoIndexer.index(LzoIndexer.java:52)
    at com.hadoop.compression.lzo.LzoIndexer.main(LzoIndexer.java:137)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
    ... 11 more
----------------------------------------------------------------------------------------------------------------
I then tried adding this to hadoop-env.sh:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hadoop/hadoop-0.20.2/lib/
Same problem as before. This problem is killing me; it has plagued me for a month.
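One detail worth knowing here: a plain directory entry on the Java class path exposes only loose .class and resource files; jars sitting inside that directory are not searched, which is why appending .../lib/ to HADOOP_CLASSPATH does not pull in the jars it contains (the hadoop launcher script already adds lib/*.jar itself). A small self-contained sketch of this class-loader behavior, using a resource in place of a compiled class (all file names here are illustrative):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ClasspathDemo {
    // True if a class loader rooted at the given URL can see the resource.
    static boolean visible(URL root, String resource) throws Exception {
        try (URLClassLoader cl = new URLClassLoader(new URL[]{root}, null)) {
            return cl.findResource(resource) != null;
        }
    }

    // Creates dir/demo.jar containing a single marker resource.
    static File makeJarInDir(File dir) throws Exception {
        File jar = new File(dir, "demo.jar");
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(jar))) {
            zos.putNextEntry(new ZipEntry("marker.txt"));
            zos.write("hi".getBytes());
            zos.closeEntry();
        }
        return jar;
    }

    public static void main(String[] args) throws Exception {
        File dir = Files.createTempDirectory("lib").toFile();
        File jar = makeJarInDir(dir);
        // A directory entry exposes loose files only; jars inside it are
        // NOT searched, which is why HADOOP_CLASSPATH=.../lib/ has no effect:
        System.out.println("via dir: " + visible(dir.toURI().toURL(), "marker.txt"));
        // Naming the jar itself does work:
        System.out.println("via jar: " + visible(jar.toURI().toURL(), "marker.txt"));
    }
}
```

The same applies to the -classpath option of the java launcher, though Java 6 and later also accept a lib/* wildcard entry that does expand to the jars in the directory.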
Re: How to get lzo compression library loaded?
Posted by Alex Luya <al...@gmail.com>.
Hi,
The output of "ps -aef | grep gpl" is:
----------------------------------------------------------------------------------------------------------------
alex 2267 1 1 22:04 pts/1 00:00:04 /usr/local/hadoop/jdk1.6.0_21/bin/java -Xmx200m -Dcom.sun.management.jmxremote -..............................................
/usr/local/hadoop/hadoop-0.20.2/bin/../conf:/usr/local/hadoop/jdk1.6.0_21/lib/tools.jar:/usr/local/hadoop/hadoop-0.20.2/bin/..:/usr/local/hadoop/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/commons-cli-1.2.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/commons-codec-1.3.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/commons-el-1.0.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/commons-.-net-1.4.1.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/core-3.1.1.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/hadoop-gpl-compression-0.2.0-dev.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jetty-6.1.14.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jetty-.......................................-log4j12-1.4.3.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/xmlenc-0.52.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/hadoop-0.20.2/bin/../lib/jsp-2.1/jsp-api-2.1.jar org.apache.hadoop.hdfs.server.namenode.NameNode
See, the two jars are there. At the very beginning, I ran: hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+' successfully, but when I ran:
nutch crawl url -dir crawl -depth 3
I got these errors:
----------------------------------------------------------------------------------------------------------------
10/08/07 22:53:30 INFO crawl.Crawl: crawl started in: crawl
.....................................................................
10/08/07 22:53:30 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
Exception in thread "main" java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
.....................................................................
at org.apache.nutch.crawl.Crawl.main(Crawl.java:124)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
.....................................................................
... 9 more
Caused by: java.lang.IllegalArgumentException: Compression codec org.apache.hadoop.io.compress.GzipCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
.....................................................................
... 14 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
.....................................................................
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
... 16 more
----------------------------------------------------------------------------------------------------------------
So GzipCodec didn't get loaded successfully here, or maybe it isn't loaded by default; I don't know. I then followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ to install LZO and ran: nutch crawl url -dir crawl -depth 3
----------------------------------------------------------------------------------------------------------------
10/08/07 22:40:41 INFO crawl.Crawl: crawl started in: crawl
.....................................................................
10/08/07 22:40:42 INFO crawl.Injector: Injector: Converting injected urls to crawl db entries.
10/08/07 22:40:42 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
Exception in thread "main" java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
.....................................................................
at org.apache.nutch.crawl.Injector.inject(Injector.java:211)
at org.apache.nutch.crawl.Crawl.main(Crawl.java:124)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
.....................................................................
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 9 more
Caused by: java.lang.IllegalArgumentException: Compression codec org.apache.hadoop.io.compress.GzipCodec not found.
.....................................................................
at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:41)
... 14 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
.....................................................................
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
... 16 more
----------------------------------------------------------------------------------------------------------------
Then I ran: hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+' and got these errors:
----------------------------------------------------------------------------------------------------------------
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.JobConf.getInputFormat(JobConf.java:400)
.....................................................................
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
.....................................................................
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 22 more
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:41)
... 27 more
Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
.....................................................................
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
... 29 more
----------------------------------------------------------------------------------------------------------------
After one month I still haven't solved this problem, and it is killing me, so here I post all my configuration files. Would you please help me dig the problem out? Thank you.
core-site.xml
----------------------------------------------------------------------------------------------------------------
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://AlexLuya:8020</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/alex/tmp</value>
  </property>
  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec
    </value>
  </property>
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>
</configuration>
----------------------------------------------------------------------------------------------------------------
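For reference, Hadoop's CompressionCodecFactory resolves each entry of io.compression.codecs with Class.forName, and at least some older releases reportedly did not trim the comma-separated list, so stray whitespace or a line break inside the <value> element (which can easily appear when the value wraps across lines, as above) becomes part of the class name and yields exactly a "Compression codec ... not found" error. A minimal plain-Java sketch of that lookup pattern (not Hadoop's actual code; the class names are stand-ins):

```java
public class CodecLookup {
    // Sketch of the lookup pattern: each comma-separated entry is resolved
    // with Class.forName, and a failure is rethrown as IllegalArgumentException
    // ("Compression codec X not found."). Note: no trim() here, mirroring the
    // pitfall described above -- embedded whitespace breaks the lookup.
    static Class<?>[] getCodecClasses(String conf) {
        String[] names = conf.split(",");
        Class<?>[] result = new Class<?>[names.length];
        for (int i = 0; i < names.length; i++) {
            try {
                result[i] = Class.forName(names[i]);
            } catch (ClassNotFoundException e) {
                throw new IllegalArgumentException(
                        "Compression codec " + names[i] + " not found.", e);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // A clean, single-line list resolves fine (JDK classes as stand-ins):
        getCodecClasses("java.util.zip.GZIPOutputStream,java.util.zip.Deflater");
        System.out.println("clean list: ok");
        // The same list with a line break inside the value fails, reproducing
        // the "Compression codec ... not found" error from this thread:
        try {
            getCodecClasses("java.util.zip.GZIPOutputStream,\njava.util.zip.Deflater");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

So one thing worth trying is keeping the whole io.compression.codecs value on a single line in the actual XML file.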
mapreduce.xml
----------------------------------------------------------------------------------------------------------------
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>AlexLuya:9001</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/alex/hadoop/mapred/local</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/tmp/hadoop/mapred/system</value>
  </property>
  <property>
    <name>mapreduce.map.output.compress</name>
    <value>true</value>
  </property>
  <property>
    <name>mapreduce.map.output.compress.codec</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>
</configuration>
----------------------------------------------------------------------------------------------------------------
hadoop-env.sh
----------------------------------------------------------------------------------------------------------------
# Set Hadoop-specific environment variables here.
# The only required environment variable is JAVA_HOME. All others are
# optional. When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.
# The java implementation to use. Required.
export JAVA_HOME=/usr/local/hadoop/jdk1.6.0_21
# Extra Java CLASSPATH elements. Optional.
# export HADOOP_CLASSPATH=
# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=200
# Extra Java runtime options. Empty by default.
#export HADOOP_OPTS=-server
# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
# export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
# export HADOOP_CLIENT_OPTS
# Extra ssh options. Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
# Where log files are stored. $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
# File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
# host:path where hadoop code should be rsync'd from. Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop
# Seconds to sleep between slave commands. Unset by default. This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1
# The directory where pid files are stored. /tmp by default.
# export HADOOP_PID_DIR=/var/hadoop/pids
# A string representing this instance of hadoop. $USER by default.
#export HADOOP_IDENT_STRING=$USER
# The scheduling priority for daemon processes. See 'man nice'.
# export HADOOP_NICENESS=10
----------------------------------------------------------------------------------------------------------------
On Sunday, August 01, 2010 03:35:55 pm Alex Kozlov wrote:
> Hi Alex,
>
> This does not seem to be a problem with LZO. Can you check if
> hadoop-0.20.2-core.jar is on the classpath when you execute your command
> (with 'ps -aef') and that hadoop-0.20.2-core.jar contains
> org.apache.hadoop.io.compress.GzipCodec (with 'jar tvf
> hadoop-0.20.2-core.jar').
>
> Thanks,
>
> Alex K
>
> On Sat, Jul 31, 2010 at 11:27 PM, <st...@yahoo.com> wrote:
> > Did you install liblzo2?
> > I don't see that listed..
> >
> > Take care,
> >
> > -stu
> >
> > -----Original Message-----
> > From: Alex Luya <al...@gmail.com>
> > Date: Sun, 1 Aug 2010 14:15:39
> > To: <hd...@hadoop.apache.org>
> > Reply-To: hdfs-user@hadoop.apache.org
> > Subject: Re: How to get lzo compression library loaded?
> >
> > Hi,
> >
> > what I am trying to run is:
> > ---------------------------------------------------------------------------------------------------------------------------
> > hadoop jar /usr/local/hadoop/hadoop-0.20.2/lib/hadoop-lzo-0.4.4.jar com.hadoop.compression.lzo.DistributedLzoIndexer target.lzo
> >
> > ---------------------------------------------------------------------------------------------------------------------------
> >
> > env | grep -i hadoop
> >
> > ---------------------------------------------------------------------------------------------------------------------------
> > NUTCH_HOME=/usr/local/hadoop/nutch-1.1
> > HADOOP_HOME=/usr/local/hadoop/hadoop-0.20.2
> > HBASE_HOME=/usr/local/hadoop/hbase-0.20.4
> >
> > PATH=/media/Work/workspace/HDScript:/usr/local/hadoop/nutch-1.1/bin:/media/Backup/Hive/hive-0.5.0-dev/bin:/usr/local/hadoop/zookeeper-3.3.1/bin:/usr/local/hadoop/hbase-0.20.4/bin:/usr/local/hadoop/nutch-1.1/bin:/opt/hypertable/0.9.3.4/bin:/usr/local/hadoop/hadoop-0.20.2/bin:/home/alex/jetty-hightide-7.1.5.v20100705/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
> > PWD=/usr/local/hadoop/hadoop-0.20.2/conf
> > JAVA_HOME=/usr/local/hadoop/jdk1.6.0_20
> > ZOOKEEPER_HOME=/usr/local/hadoop/zookeeper-3.3.1
> > OLDPWD=/usr/local/hadoop/hadoop-0.20.2
> >
> >
> > ---------------------------------------------------------------------------------------------------------------------------
> >
> >
> > env | grep -i classpath
> >
> > ---------------------------------------------------------------------------------------------------------------------------
> > no output
> > ---------------------------------------------------------------------------------------------------------------------------
> >
> >
> > All my operations are:
> > 1. Got the source code from http://github.com/kevinweil/hadoop-lzo and compiled it successfully.
> > 2. Copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib on each master and slave.
> > 3. Copied all files under ../Linux-amd64-64/lib to $HADOOP_HOME/lib/native/Linux-amd64-64 on each master and slave.
> > 4. Uploaded a file test.lzo to HDFS.
> > 5. Ran: hadoop jar $HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar com.hadoop.compression.lzo.DistributedLzoIndexer test.lzo to test it.
> >
> > Is any other configuration needed?
> >
> > On Sunday, August 01, 2010 11:37:19 am Alex Kozlov wrote:
> > > Hi Alex,
> > >
> > > org.apache.hadoop.io.compress.GzipCodec is in the hadoop-core-*.jar.
> > > You don't need to add /usr/local/hadoop/hadoop-0.20.2/lib/ to the
> > > classpath since the hadoop shell script does it for you. What exactly is
> > > the command you are trying to run? Can you also give the output of
> > > 'env | grep -i hadoop' and 'env | grep -i classpath'.
> > >
> > > Alex K
> > >
> > > On Sat, Jul 31, 2010 at 1:38 AM, Alex Luya <al...@gmail.com> wrote:
> > > > [...]
Re: How to get lzo compression library loaded?
Posted by Alex Kozlov <al...@cloudera.com>.
Hi Alex,
This does not seem to be a problem with LZO. Can you check if
hadoop-0.20.2-core.jar is on the classpath when you execute your command
(with 'ps -aef') and that hadoop-0.20.2-core.jar contains
org.apache.hadoop.io.compress.GzipCodec (with 'jar tvf
hadoop-0.20.2-core.jar').
Thanks,
Alex K
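The 'jar tvf' check described above can also be scripted. Here is a small self-contained sketch that scans a jar for a class entry; it builds a throwaway jar so it runs anywhere, but in practice you would point it at hadoop-0.20.2-core.jar (the path and class names are just the ones from this thread):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class JarContains {
    // Returns true if the given jar/zip contains an entry for the class,
    // e.g. "org/apache/hadoop/io/compress/GzipCodec.class".
    static boolean containsClass(File jar, String className) throws Exception {
        String entryName = className.replace('.', '/') + ".class";
        try (ZipFile zf = new ZipFile(jar)) {
            for (Enumeration<? extends ZipEntry> e = zf.entries(); e.hasMoreElements(); ) {
                if (e.nextElement().getName().equals(entryName)) return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // Demo against a throwaway jar so the sketch is self-contained;
        // in practice, pass the path to hadoop-0.20.2-core.jar instead.
        File tmp = File.createTempFile("demo", ".jar");
        tmp.deleteOnExit();
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(tmp))) {
            zos.putNextEntry(new ZipEntry("org/apache/hadoop/io/compress/GzipCodec.class"));
            zos.closeEntry();
        }
        System.out.println(containsClass(tmp, "org.apache.hadoop.io.compress.GzipCodec"));
        System.out.println(containsClass(tmp, "com.hadoop.compression.lzo.LzoCodec"));
    }
}
```

If this reports false for GzipCodec against the core jar that is actually on the classpath, then the jar being picked up is not the one you think it is.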
On Sat, Jul 31, 2010 at 11:27 PM, <st...@yahoo.com> wrote:
> Did you install liblzo2?
> I don't see that listed..
>
> Take care,
> -stu
> -----Original Message-----
> From: Alex Luya <al...@gmail.com>
> Date: Sun, 1 Aug 2010 14:15:39
> To: <hd...@hadoop.apache.org>
> Reply-To: hdfs-user@hadoop.apache.org
> Subject: Re: How to get lzo compression library loaded?
>
> Hi,
> what I am trying to run is:
>
> ---------------------------------------------------------------------------------------------------------------------------
> hadoop jar /usr/local/hadoop/hadoop-0.20.2/lib/hadoop-lzo-0.4.4.jar
> com.hadoop.compression.lzo.DistributedLzoIndexer target.lzo
>
> ---------------------------------------------------------------------------------------------------------------------------
>
> env | grep -i hadoop
>
> ---------------------------------------------------------------------------------------------------------------------------
> NUTCH_HOME=/usr/local/hadoop/nutch-1.1
> HADOOP_HOME=/usr/local/hadoop/hadoop-0.20.2
> HBASE_HOME=/usr/local/hadoop/hbase-0.20.4
>
> PATH=/media/Work/workspace/HDScript:/usr/local/hadoop/nutch-1.1/bin:/media/Backup/Hive/hive-0.5.0-dev/bin:/usr/local/hadoop/zookeeper-3.3.1/bin:/usr/local/hadoop/hbase-0.20.4/bin:/usr/local/hadoop/nutch-1.1/bin:/opt/hypertable/0.9.3.4/bin:/usr/local/hadoop/hadoop-0.20.2/bin:/home/alex/jetty-hightide-7.1.5.v20100705/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
> PWD=/usr/local/hadoop/hadoop-0.20.2/conf
> JAVA_HOME=/usr/local/hadoop/jdk1.6.0_20
> ZOOKEEPER_HOME=/usr/local/hadoop/zookeeper-3.3.1
> OLDPWD=/usr/local/hadoop/hadoop-0.20.2
>
>
> ---------------------------------------------------------------------------------------------------------------------------
>
>
> env | grep -i classpath
>
> ---------------------------------------------------------------------------------------------------------------------------
> no output
>
> ---------------------------------------------------------------------------------------------------------------------------
>
>
> All my operations are:
> 1. Got the source code from http://github.com/kevinweil/hadoop-lzo and compiled it successfully.
> 2. Copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib on each master and slave.
> 3. Copied all files under ../Linux-amd64-64/lib to $HADOOP_HOME/lib/native/Linux-amd64-64 on each master and slave.
> 4. Uploaded a file test.lzo to HDFS.
> 5. Ran: hadoop jar $HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar com.hadoop.compression.lzo.DistributedLzoIndexer test.lzo to test it.
>
> Is any other configuration needed?
>
> On Sunday, August 01, 2010 11:37:19 am Alex Kozlov wrote:
> > Hi Alex,
> >
> > org.apache.hadoop.io.compress.GzipCodec is in the hadoop-core-*.jar. You
> > don't need to add /usr/local/hadoop/hadoop-0.20.2/lib/ to the classpath
> > since the hadoop shell script does it for you. What exactly is the command
> > you are trying to run? Can you also give the output of 'env | grep -i
> > hadoop' and 'env | grep -i classpath'.
> >
> > Alex K
> >
> > On Sat, Jul 31, 2010 at 1:38 AM, Alex Luya <al...@gmail.com> wrote:
> > > [...]
Re: How to get lzo compression library loaded?
Posted by st...@yahoo.com.
Did you install liblzo2?
I don't see that listed..
Take care,
-stu
-----Original Message-----
From: Alex Luya <al...@gmail.com>
Date: Sun, 1 Aug 2010 14:15:39
To: <hd...@hadoop.apache.org>
Reply-To: hdfs-user@hadoop.apache.org
Subject: Re: How to get lzo compression library loaded?
Hi,
what I am trying to run is:
---------------------------------------------------------------------------------------------------------------------------
hadoop jar /usr/local/hadoop/hadoop-0.20.2/lib/hadoop-lzo-0.4.4.jar
com.hadoop.compression.lzo.DistributedLzoIndexer target.lzo
---------------------------------------------------------------------------------------------------------------------------
env | grep -i hadoop
---------------------------------------------------------------------------------------------------------------------------
NUTCH_HOME=/usr/local/hadoop/nutch-1.1
HADOOP_HOME=/usr/local/hadoop/hadoop-0.20.2
HBASE_HOME=/usr/local/hadoop/hbase-0.20.4
PATH=/media/Work/workspace/HDScript:/usr/local/hadoop/nutch-1.1/bin:/media/Backup/Hive/hive-0.5.0-dev/bin:/usr/local/hadoop/zookeeper-3.3.1/bin:/usr/local/hadoop/hbase-0.20.4/bin:/usr/local/hadoop/nutch-1.1/bin:/opt/hypertable/0.9.3.4/bin:/usr/local/hadoop/hadoop-0.20.2/bin:/home/alex/jetty-hightide-7.1.5.v20100705/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/hadoop/jdk1.6.0_20/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
PWD=/usr/local/hadoop/hadoop-0.20.2/conf
JAVA_HOME=/usr/local/hadoop/jdk1.6.0_20
ZOOKEEPER_HOME=/usr/local/hadoop/zookeeper-3.3.1
OLDPWD=/usr/local/hadoop/hadoop-0.20.2
---------------------------------------------------------------------------------------------------------------------------
env | grep -i classpath
---------------------------------------------------------------------------------------------------------------------------
no output
---------------------------------------------------------------------------------------------------------------------------
All my operations were:
1. Got the source code from http://github.com/kevinweil/hadoop-lzo and compiled it successfully.
2. Copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib on each master and slave.
3. Copied all files under ../Linux-amd64-64/lib to $HADOOP_HOME/lib/native/Linux-amd64-64 on each master and slave.
4. Uploaded a file, test.lzo, to HDFS.
5. Ran, to test it: hadoop jar $HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar com.hadoop.compression.lzo.DistributedLzoIndexer test.lzo
Is any other configuration needed?
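For reference, the hadoop-lzo README also asks that the codecs be registered in core-site.xml; a sketch of that configuration follows (property names are from the kevinweil/hadoop-lzo documentation; the exact codec list is illustrative):

```xml
<!-- core-site.xml: register the LZO codecs. Because this property
     overrides the built-in default list, the stock codecs (Gzip,
     Default, BZip2) are listed explicitly so they stay available. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```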
On Sunday, August 01, 2010 11:37:19 am Alex Kozlov wrote:
> Hi Alex,
>
> org.apache.hadoop.io.compress.GzipCodec is in the hadoop-core-*.jar. You
> don't need to add /usr/local/hadoop/hadoop-0.20.2/lib/ to the classpath
> since hadoop shell script does it for you. What is exactly the command you
> are trying to run? Can you also give the output of 'env | grep -i hadoop'
> and 'env | grep -i classpath'.
>
> Alex K
>
> On Sat, Jul 31, 2010 at 1:38 AM, Alex Luya <al...@gmail.com> wrote:
> > Hello:
> > I have followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ
> > to install lzo compression library,and copy hadoop-lzo-0.4.4.jar
> > to $HADOOP_HOME/lib,and all files under
> > ..lib/native/Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64,
> > and run example,but got this errors:
> >
> > -------------------------------------------------------------------------
> > Exception in thread "main" java.lang.IllegalArgumentException: Compression codec
> > org.apache.hadoop.io.compress.GzipCodec not found.
> > at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
> > at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
> > at com.hadoop.compression.lzo.LzoIndex.createIndex(LzoIndex.java:202)
> > at com.hadoop.compression.lzo.LzoIndexer.indexSingleFile(LzoIndexer.java:117)
> > at com.hadoop.compression.lzo.LzoIndexer.indexInternal(LzoIndexer.java:98)
> > at com.hadoop.compression.lzo.LzoIndexer.index(LzoIndexer.java:52)
> > at com.hadoop.compression.lzo.LzoIndexer.main(LzoIndexer.java:137)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > at java.lang.reflect.Method.invoke(Method.java:597)
> > at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
> > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> > at java.lang.Class.forName0(Native Method)
> > at java.lang.Class.forName(Class.java:247)
> > at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
> > at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
> > ... 11 more
> > -------------------------------------------------------------------------
> >
> > and then I try to add this to hadoop-env.sh:
> >
> > export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hadoop/hadoop-0.20.2/lib/
> >
> > Same problem as before,this problem is killing me,because it has
> > surrounded me for one month.
Re: How to get lzo compression library loaded?
Posted by Alex Kozlov <al...@cloudera.com>.
Hi Alex,
org.apache.hadoop.io.compress.GzipCodec is in the hadoop-core-*.jar. You
don't need to add /usr/local/hadoop/hadoop-0.20.2/lib/ to the classpath
since hadoop shell script does it for you. What is exactly the command you
are trying to run? Can you also give the output of 'env | grep -i hadoop'
and 'env | grep -i classpath'.
Alex K
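To illustrate Alex K's point about the lib/ directory, here is a simplified sketch (not the real script, just the globbing idea) of how the bin/hadoop wrapper assembles the classpath; it runs against a throwaway demo directory:

```shell
# Simplified sketch: the wrapper starts from conf/ and appends every
# individual jar under $HADOOP_HOME/lib, which is why dropping
# hadoop-lzo-0.4.4.jar into lib/ needs no HADOOP_CLASSPATH edits.
demo_home=$(mktemp -d)
mkdir -p "$demo_home/lib" "$demo_home/conf"
touch "$demo_home/lib/hadoop-lzo-0.4.4.jar"

CLASSPATH="$demo_home/conf"
for jar in "$demo_home"/lib/*.jar; do
  CLASSPATH="$CLASSPATH:$jar"
done
echo "$CLASSPATH"
```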
On Sat, Jul 31, 2010 at 1:38 AM, Alex Luya <al...@gmail.com> wrote:
> Hello:
> I have followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ
> to install lzo compression library,and copy
> hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib,and all files under
> ..lib/native/Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64,
> and run example,but got this errors:
>
> ----------------------------------------------------------------------------------------------------------------
> Exception in thread "main" java.lang.IllegalArgumentException: Compression codec
> org.apache.hadoop.io.compress.GzipCodec not found.
> at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
> at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
> at com.hadoop.compression.lzo.LzoIndex.createIndex(LzoIndex.java:202)
> at com.hadoop.compression.lzo.LzoIndexer.indexSingleFile(LzoIndexer.java:117)
> at com.hadoop.compression.lzo.LzoIndexer.indexInternal(LzoIndexer.java:98)
> at com.hadoop.compression.lzo.LzoIndexer.index(LzoIndexer.java:52)
> at com.hadoop.compression.lzo.LzoIndexer.main(LzoIndexer.java:137)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:247)
> at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
> at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
> ... 11 more
>
>
>
> ----------------------------------------------------------------------------------------------------------------
>
> and then I try to add this to hadoop-env.sh:
>
> export
> HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hadoop/hadoop-0.20.2/lib/
>
> Same problem as before,this problem is killing me,because it has surrounded
> me
> for one month.
>
>
>
Re: How to get lzo compression library loaded?
Posted by Stuart Smith <st...@yahoo.com>.
Hello Alex,
Here are some simple mistakes I made yesterday; maybe the cause is the same?
Worth a try.
Have you:
- installed the native *C* library? i.e., not the files you download from the Google page, but liblzo2. On Ubuntu this is:
$> apt-get install lzop
According to the gpl-compression FAQ:
Note that you must have both 32-bit and 64-bit liblzo2 installed. This is how it looks on my RedHat build machine:
% ls -l /usr/lib*/liblzo2*
-rw-r--r-- 1 root root 171056 Mar 20 2006 /usr/lib/liblzo2.a
lrwxrwxrwx 1 root root 16 Feb 17 2007 /usr/lib/liblzo2.so -> liblzo2.so.2.0.0*
lrwxrwxrwx 1 root root 16 Feb 17 2007 /usr/lib/liblzo2.so.2 -> liblzo2.so.2.0.0*
-rwxr-xr-x 1 root root 129067 Mar 20 2006 /usr/lib/liblzo2.so.2.0.0*
-rw-r--r-- 1 root root 208494 Mar 20 2006 /usr/lib64/liblzo2.a
lrwxrwxrwx 1 root root 16 Feb 17 2007 /usr/lib64/liblzo2.so -> liblzo2.so.2.0.0*
lrwxrwxrwx 1 root root 16 Feb 17 2007 /usr/lib64/liblzo2.so.2 -> liblzo2.so.2.0.0*
-rwxr-xr-x 1 root root 126572 Mar 20 2006 /usr/lib64/liblzo2.so.2.0.0*
This tripped me up a little yesterday - I was rushing to get some new nodes installed, and forgot.
- Also, you'll need to restart your regionservers after installing (at least I had to) - only on the nodes where you installed the lib, though; you don't have to restart everything.
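A quick way to check whether the native library is visible at all (a sketch; package names vary by distro):

```shell
# Check whether the dynamic linker can resolve liblzo2; the JNI half of
# hadoop-lzo cannot load the codec natively without it. `ldconfig -p`
# lists the linker's cache on typical Linux systems.
if ldconfig -p 2>/dev/null | grep -q 'liblzo2\.so'; then
  lzo_status="present"
else
  lzo_status="missing (install your distro's lzo2 package)"
fi
echo "liblzo2: $lzo_status"
```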
Take care,
-stu
--- On Sat, 7/31/10, Alex Luya <al...@gmail.com> wrote:
> From: Alex Luya <al...@gmail.com>
> Subject: How to get lzo compression library loaded?
> To: common-user@hadoop.apache.org, hdfs-user@hadoop.apache.org
> Date: Saturday, July 31, 2010, 4:38 AM
> Hello:
> I have followed this link: http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ
> to install lzo compression library, and copy
> hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib, and all files under
> ..lib/native/Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64,
> and run example, but got this errors:
> ----------------------------------------------------------------------------------------------------------------
> Exception in thread "main" java.lang.IllegalArgumentException: Compression codec
> org.apache.hadoop.io.compress.GzipCodec not found.
> at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
> at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
> at com.hadoop.compression.lzo.LzoIndex.createIndex(LzoIndex.java:202)
> at com.hadoop.compression.lzo.LzoIndexer.indexSingleFile(LzoIndexer.java:117)
> at com.hadoop.compression.lzo.LzoIndexer.indexInternal(LzoIndexer.java:98)
> at com.hadoop.compression.lzo.LzoIndexer.index(LzoIndexer.java:52)
> at com.hadoop.compression.lzo.LzoIndexer.main(LzoIndexer.java:137)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.GzipCodec
> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:247)
> at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
> at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
> ... 11 more
> ----------------------------------------------------------------------------------------------------------------
>
> and then I try to add this to hadoop-env.sh:
>
> export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hadoop/hadoop-0.20.2/lib/
>
> Same problem as before, this problem is killing me, because it has surrounded me
> for one month.