Posted to common-user@hadoop.apache.org by "W.P. McNeill" <bi...@gmail.com> on 2011/09/08 23:48:01 UTC

How do I run all the Hadoop unit tests?

Should I be able to go to the root of a clean 0.20.203.0 download, type "ant
test", and have everything work? I tried this and saw a lot of test
failures. Is there some other configuration setup I have to do?  (For
instance, do I have to have a running cluster?)
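
For concreteness, this is what I'm running from the source root (the
-Dtestcase form for a single test class is something I copied from the
wiki, so I haven't verified it applies to 0.20.203.0):

    cd hadoop-0.20.203.0
    ant test                       # full unit test suite
    ant -Dtestcase=TestCLI test    # one test class, if build.xml supports it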

Re: How do I run all the Hadoop unit tests?

Posted by "W.P. McNeill" <bi...@gmail.com>.
I never did. Maybe I'll try again with version 1.0.0.

On Thu, Nov 17, 2011 at 1:12 AM, cheersyang <wi...@yahoo.cn> wrote:

> I've run into the same issue as you; curious to know whether you figured out a solution.

Re: How do I run all the Hadoop unit tests?

Posted by cheersyang <wi...@yahoo.cn>.
I've run into the same issue as you; curious to know whether you figured out a solution.


Re: How do I run all the Hadoop unit tests?

Posted by "W.P. McNeill" <bi...@gmail.com>.
It looks like most of the unit test failures I'm seeing are due to the
incorrect permissions on the test DFS data directories.
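
For anyone hitting the same thing, this is roughly how I confirmed it
(paths are relative to my checkout; yours may differ):

    # see what permissions the test data dirs actually ended up with
    ls -ld build/test/data/dfs/data/data1 build/test/data/dfs/data/data2

    # the DataNode insists on rwxr-xr-x (755) and rejects group-writable dirs
    chmod 755 build/test/data/dfs/data/data1 build/test/data/dfs/data/data2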

Re: How do I run all the Hadoop unit tests?

Posted by "W.P. McNeill" <bi...@gmail.com>.
Vanilla Hadoop 0.20.203.0 on Ubuntu with JDK 1.6.0. I don't know exactly how
many failures there were, but there were a lot. Just from watching the
output of ant I'd guess that about a quarter of the unit tests were failing.

I'm rerunning with "ant clean test" right now.  Here's a snippet from the
end of the org.apache.hadoop.cli.TestCLI log file:

2011-09-08 14:56:24,413 WARN  datanode.DataNode (DataNode.java:makeInstance(1475)) - Invalid directory in dfs.data.dir: Incorrect permission for /atlas/spock/bmcneill/hadoop-deployment/hadoop/hadoop-0.20.203.0/build/test/data/dfs/data/data1, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-09-08 14:56:24,430 WARN  datanode.DataNode (DataNode.java:makeInstance(1475)) - Invalid directory in dfs.data.dir: Incorrect permission for /atlas/spock/bmcneill/hadoop-deployment/hadoop/hadoop-0.20.203.0/build/test/data/dfs/data/data2, expected: rwxr-xr-x, while actual: rwxrwxr-x
2011-09-08 14:56:24,430 ERROR datanode.DataNode (DataNode.java:makeInstance(1481)) - All directories in dfs.data.dir are invalid.

Maybe there's some umask weirdness? The actual permissions (rwxrwxr-x) are
what a umask of 002 would produce, while the DataNode wants 755 directories,
i.e. a umask of 022. I'm not sure whether all the failures are of this kind.
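
If umask is the culprit, I'd expect something like this to work around it
(022 here is my guess at the fix; check what your shell reports first):

    umask            # 0002 would explain dirs coming up rwxrwxr-x
    umask 022        # new dirs then get created rwxr-xr-x, which the DataNode accepts
    ant clean test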

PS. Today my replies to this list keep getting bounced as spam. I'm not sure
why.

Re: How do I run all the Hadoop unit tests?

Posted by Harsh J <ha...@cloudera.com>.
Hey McNeill,

On Fri, Sep 9, 2011 at 3:18 AM, W.P. McNeill <bi...@gmail.com> wrote:
> Should I be able to go to the root of a clean 0.20.203.0 download, type "ant
> test", and have everything work?

I believe "ant clean test" should run everything, and should also
work. How many failures are you seeing? What platform are you running
it on, and against what JVM?
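
A quick way to capture that info for the list (all standard commands):

    java -version    # JVM vendor and version
    ant -version     # Ant version
    uname -a         # OS and kernel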

-- 
Harsh J