Posted to common-user@hadoop.apache.org by sudha sadhasivam <su...@yahoo.com> on 2008/07/31 11:59:40 UTC

Hadoop + Biojava - error

Respected Sir
We are ME students working on Hadoop for bioinformatics.
We have installed Hadoop on Linux, and it is working well.
We have also installed BioJava (a Java library for bioinformatics) for DNA sequence searching.
 
To install BioJava, we copied the jar files biojava-live.jar, commons-cli.jar, bytecode.jar, commons-collections-2.1.jar, and commons-pool-1.1.jar into the /root/usr/java/jdk1.6.0_06/jre/lib/ext directory.
 
The BioJava examples work well.
 
Now, if we run hadoop dfs after installing BioJava, the following error is displayed:
 
[root@oss8 hadoop-0.17.0]# bin/hadoop dfs -put conf input
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.OptionBuilder.withArgPattern(Ljava/lang/String;I)Lorg/apache/commons/cli/OptionBuilder;
        at org.apache.hadoop.util.GenericOptionsParser.buildGeneralOptions(GenericOptionsParser.java:164)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:213)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:120)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:105)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:1866)
 
If we remove commons-cli.jar (a jar file required by BioJava) from the /root/usr/java/jdk1.6.0_06/jre/lib/ext directory, then the Hadoop examples work.
 
We think the conflict arises because both Hadoop and commons-cli.jar contain classes in the org.apache namespace.
 
As we want to run BioJava on Hadoop DFS, can you suggest a way to correct this error?
Please reply at your earliest convenience. Thank you.
 
With regards
Geetha Jini, Komagal Meenakshi


      

Re: Hadoop + Biojava - error

Posted by Steve Loughran <st...@apache.org>.
sudha sadhasivam wrote:
> Respected Sir
> We are ME students working on Hadoop for bioinformatics.
> We have installed Hadoop on Linux, and it is working well.
> We have also installed BioJava (a Java library for bioinformatics) for DNA sequence searching.
>  
> To install BioJava, we copied the jar files biojava-live.jar, commons-cli.jar, bytecode.jar, commons-collections-2.1.jar, and commons-pool-1.1.jar into the /root/usr/java/jdk1.6.0_06/jre/lib/ext directory.
>  
> The BioJava examples work well.
>  
> Now, if we run hadoop dfs after installing BioJava, the following error is displayed:
>  
> [root@oss8 hadoop-0.17.0]# bin/hadoop dfs -put conf input
> Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.OptionBuilder.withArgPattern(Ljava/lang/String;I)Lorg/apache/commons/cli/OptionBuilder;
>         at org.apache.hadoop.util.GenericOptionsParser.buildGeneralOptions(GenericOptionsParser.java:164)
>         at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:213)
>         at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:120)
>         at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:105)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1866)
>  
> If we remove commons-cli.jar (a jar file required by BioJava) from the /root/usr/java/jdk1.6.0_06/jre/lib/ext directory, then the Hadoop examples work.
>  
> We think the conflict arises because both Hadoop and commons-cli.jar contain classes in the org.apache namespace.

Maybe, but they are clearly different versions: Hadoop currently uses 
its own build of commons-cli-2.0.
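A quick way to confirm which copy of a duplicated class actually wins is to print its code source. This is a diagnostic sketch (not from the original thread); the default class name is taken from the NoSuchMethodError stack trace above, and note that classes loaded by the bootstrap loader report no code source at all:

```java
import java.security.CodeSource;

// Diagnostic sketch: print the jar (or directory) a class was loaded from.
public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Defaults to the commons-cli class from the stack trace above;
        // pass another fully-qualified class name as the first argument.
        String name = args.length > 0
                ? args[0]
                : "org.apache.commons.cli.OptionBuilder";
        CodeSource src = Class.forName(name)
                .getProtectionDomain().getCodeSource();
        // Bootstrap (and sometimes extension) classes have no code source;
        // a null here means the class did not come from your classpath.
        System.out.println(src == null ? "(no code source)" : src.getLocation());
    }
}
```

Running it once with the extra jar still in jre/lib/ext and once without shows which commons-cli Hadoop is really seeing.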

>  
> As we want to run BioJava on Hadoop DFS, can you suggest a way to correct this error?

Don't put JARs in the jre/lib or jre/lib/ext directories; have the 
different programs pick them up themselves, however they build their 
classpaths. Hadoop only needs the library for the command-line entry 
points; if you have client code that calls hadoop.jar methods directly, 
then all you need to run Hadoop are, I believe:
  commons-logging, log4j, commons-httpclient, commons-codec
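The advice above can be sketched as a shell fragment. The directory and job jar name below are hypothetical; HADOOP_CLASSPATH is the hook the bin/hadoop script reads to append user jars to its classpath without touching the JRE:

```shell
# Hypothetical location where the BioJava jars were unpacked instead of
# being copied into jre/lib/ext:
BIOJAVA_LIB=/opt/biojava/lib

# Join every jar in that directory into a colon-separated classpath
# (assumes no spaces in the paths):
CP=$(echo "$BIOJAVA_LIB"/*.jar | tr ' ' ':')

# bin/hadoop appends HADOOP_CLASSPATH after Hadoop's own jars, so Hadoop
# keeps its bundled commons-cli while your job code can still see BioJava:
export HADOOP_CLASSPATH="$CP"

# Then run the job as usual, e.g.:
#   bin/hadoop jar my-dna-search.jar MyDnaSearch input output
```

This keeps each program's dependencies isolated: the JRE stays clean, and removing or upgrading BioJava is just a matter of changing one directory.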