Posted to user@hadoop.apache.org by Koert Kuipers <ko...@tresata.com> on 2012/11/10 08:43:49 UTC
MiniMRCluster not behaving in hadoop 1.0.4
I am porting a map-reduce library from CDH3 to Apache Hadoop 1.0.4. The
unit tests used to run fine but are now misbehaving. I do a simple setup in
the unit test like this:
private static MiniDFSCluster dfsCluster;
private static FileSystem fs;
private static MiniMRCluster mrCluster;

@BeforeClass
public static void oneTimeSetUp() throws Exception {
    new File("tmp/logs").mkdirs();
    System.setProperty("hadoop.log.dir", "tmp/logs");
    System.setProperty("javax.xml.parsers.SAXParserFactory",
        "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
    final Configuration conf = new Configuration();
    conf.set("io.skip.checksum.errors", "true");
    dfsCluster = new MiniDFSCluster(conf, 2, true, null);
    fs = dfsCluster.getFileSystem();
    mrCluster = new MiniMRCluster(2,
        dfsCluster.getFileSystem().getUri().toString(), 1, null, null,
        new org.apache.hadoop.mapred.JobConf(conf));
}
However, with Hadoop 1.0.4 this results in an exception and the JobTracker
dies:
12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
    at org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
    at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
    at org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
I am not sure what this error is about, but I managed to get around it for
now by removing the two Hadoop Jasper dependencies that seem to be
involved: jasper-compiler-5.5.12.jar and jasper-runtime-5.5.12.jar. With
this "fix" I get further and my jobs start, but now all the tasks fail
with messages like this:
attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
Am I doing something completely wrong, and was I just lucky that this used
to run with CDH3? Any ideas? Thanks! Koert
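A NoSuchFieldError like the one above usually means a class was compiled against one version of a dependency but loaded from another at runtime. A quick way to see which JAR actually supplies a class is to ask its ProtectionDomain; the sketch below (a hypothetical helper, not from the thread) prints the code source of the Jasper class named in the stack trace:

```java
// WhichJar: print where a class is loaded from, to spot classpath
// conflicts such as the IS_SECURITY_ENABLED NoSuchFieldError above.
import java.security.CodeSource;

public class WhichJar {
    public static String jarOf(String className) {
        try {
            CodeSource src = Class.forName(className)
                    .getProtectionDomain().getCodeSource();
            // JDK bootstrap classes report no code source
            return (src == null) ? "bootstrap (JDK)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // The Jasper class implicated in the stack trace in this thread:
        System.out.println(jarOf("org.apache.jasper.compiler.JspRuntimeContext"));
    }
}
```

If the printed location is not the JAR you expect (for example, a jsp-2.1 JAR instead of jasper-runtime), two versions of Jasper are fighting on the classpath.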
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Steve Loughran <st...@hortonworks.com>.
I don't see anything that obvious, except that there are two servlet API
JARs - one of those should go.
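The duplicate-JAR check can be automated: scan a lib directory and flag any class resource that appears in more than one JAR (which is exactly how the two servlet-api JARs would show up). This is a standalone sketch in modern Java, not something from the thread, and the default `lib/test` path is an assumption:

```java
// DupFinder: list class files that appear in more than one JAR in a
// directory, to catch clashes like the two servlet-api JARs here.
import java.io.File;
import java.util.*;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class DupFinder {
    /** Map each class resource name to the JARs in dir that contain it,
     *  keeping only names that occur in two or more JARs. */
    public static Map<String, List<String>> duplicates(File dir) throws Exception {
        Map<String, List<String>> owners = new TreeMap<>();
        File[] jars = dir.listFiles((d, name) -> name.endsWith(".jar"));
        if (jars == null) return owners;
        for (File jar : jars) {
            try (JarFile jf = new JarFile(jar)) {
                for (Enumeration<JarEntry> e = jf.entries(); e.hasMoreElements();) {
                    String name = e.nextElement().getName();
                    if (name.endsWith(".class")) {
                        owners.computeIfAbsent(name, k -> new ArrayList<>()).add(jar.getName());
                    }
                }
            }
        }
        owners.values().removeIf(list -> list.size() < 2);
        return owners;
    }

    public static void main(String[] args) throws Exception {
        File dir = new File(args.length > 0 ? args[0] : "lib/test");
        duplicates(dir).forEach((cls, jars) -> System.out.println(cls + " -> " + jars));
    }
}
```

Run against the lib/test directory listed below, this would print every javax.servlet class twice, once per servlet-api JAR.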
On 10 November 2012 19:42, Koert Kuipers <ko...@tresata.com> wrote:
> ant sets classpath for me but i have same issues when i set it myself.
> for example i can run it like this:
> java -cp lib/test/*:build/classes:build/test
> com.tresata.hadoop.mapred.MapRedTest
>
> my dependencies are hadoop-core, hadoop-test and i also had to add
> jsr311api to make it work.
> ls lib/test gives:
>
> ant-1.6.5.jar
> commons-beanutils-1.7.0.jar
> commons-beanutils-core-1.8.0.jar
> commons-cli-1.2.jar
> commons-codec-1.4.jar
> commons-collections-3.2.1.jar
> commons-configuration-1.6.jar
> commons-digester-1.8.jar
> commons-el-1.0.jar
> commons-httpclient-3.0.1.jar
> commons-lang-2.4.jar
> commons-logging-1.1.1.jar
> commons-math-2.1.jar
> commons-net-1.4.1.jar
> core-3.1.1.jar
> ftplet-api-1.0.0.jar
> ftpserver-core-1.0.0.jar
> ftpserver-deprecated-1.0.0-M2.jar
> guava-r06.jar
> hadoop-core-1.0.4.jar
> hadoop-test-1.0.4.jar
> hsqldb-1.8.0.10.jar
> jackson-core-asl-1.0.1.jar
> jackson-mapper-asl-1.0.1.jar
> jasper-compiler-5.5.12.jar
> jasper-runtime-5.5.12.jar
> jets3t-0.7.1.jar
> jetty-6.1.26.jar
> jetty-util-6.1.26.jar
> joda-time-2.1.jar
> jsp-2.1-6.1.14.jar
> jsp-api-2.1-6.1.14.jar
> jsr311-api-1.1.1.jar
> junit-3.8.1.jar
> kfs-0.3.jar
> log4j-1.2.16.jar
> mina-core-2.0.0-M5.jar
> oro-2.0.8.jar
> servlet-api-2.5-20081211.jar
> servlet-api-2.5-6.1.14.jar
> slf4j-api-1.6.1.jar
> slf4j-log4j12-1.6.1.jar
> xmlenc-0.52.jar
>
>
> On Sat, Nov 10, 2012 at 5:24 AM, Steve Loughran <st...@hortonworks.com> wrote:
>
>>
>>
>> On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
>>
>>> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
>>> unit tests used to run fine but are now misbehaving. i do a simple setup in
>>> the unit test like this:
>>>
>>> private static MiniDFSCluster dfsCluster;
>>> private static FileSystem fs;
>>> private static MiniMRCluster mrCluster;
>>>
>>> @BeforeClass
>>> public static void oneTimeSetUp() throws Exception {
>>> new File("tmp/logs").mkdirs();
>>> System.setProperty("hadoop.log.dir", "tmp/logs");
>>> System.setProperty("javax.xml.parsers.SAXParserFactory",
>>>
>>> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
>>> final Configuration conf = new Configuration();
>>> conf.set("io.skip.checksum.errors", "true");
>>> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
>>> fs = dfsCluster.getFileSystem();
>>> mrCluster = new MiniMRCluster(2,
>>> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
>>> org.apache.hadoop.mapred.JobConf(conf));
>>> }
>>>
>>> however with hadoop 1.0.4 this results in an exception where the
>>> jobtracker dies:
>>>
>>> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
>>> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
>>> at
>>> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
>>> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
>>> at
>>> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>>>
>>> not sure what this error is about, but i managed to get around it for
>>> now by removing the 2 hadoop apache jasper dependencies which seem to have
>>> something to do with it: jasper-compiler-5.5.12.jar and
>>> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
>>> starts, but now all the tasks fail with messages like this:
>>>
>>> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>>>
>>> am i doing something completely wrong, and was i just lucky that this
>>> used to run with CDH3? any ideas? thanks! koert
>>>
>>>
>> This sounds like some classpath/versioning thing - MiniMRCluster works
>> for me; I even fixed that system property setup in trunk as I got fed up
>> with having to do it myself.
>>
>> What is your CP in your test run?
>>
>
>
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Koert Kuipers <ko...@tresata.com>.
Ant sets the classpath for me, but I have the same issues when I set it
myself. For example, I can run it like this:
java -cp lib/test/*:build/classes:build/test com.tresata.hadoop.mapred.MapRedTest
My dependencies are hadoop-core and hadoop-test, and I also had to add
jsr311-api to make it work.
ls lib/test gives:
ant-1.6.5.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-core-1.0.4.jar
hadoop-test-1.0.4.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
jets3t-0.7.1.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
joda-time-2.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsr311-api-1.1.1.jar
junit-3.8.1.jar
kfs-0.3.jar
log4j-1.2.16.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
xmlenc-0.52.jar
On Sat, Nov 10, 2012 at 5:24 AM, Steve Loughran <st...@hortonworks.com> wrote:
>
>
> On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
>
>> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
>> unit tests used to run fine but are now misbehaving. i do a simple setup in
>> the unit test like this:
>>
>> private static MiniDFSCluster dfsCluster;
>> private static FileSystem fs;
>> private static MiniMRCluster mrCluster;
>>
>> @BeforeClass
>> public static void oneTimeSetUp() throws Exception {
>> new File("tmp/logs").mkdirs();
>> System.setProperty("hadoop.log.dir", "tmp/logs");
>> System.setProperty("javax.xml.parsers.SAXParserFactory",
>>
>> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
>> final Configuration conf = new Configuration();
>> conf.set("io.skip.checksum.errors", "true");
>> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
>> fs = dfsCluster.getFileSystem();
>> mrCluster = new MiniMRCluster(2,
>> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
>> org.apache.hadoop.mapred.JobConf(conf));
>> }
>>
>> however with hadoop 1.0.4 this results in an exception where the
>> jobtracker dies:
>>
>> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
>> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
>> at
>> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
>> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
>> at
>> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>>
>> not sure what this error is about, but i managed to get around it for now
>> by removing the 2 hadoop apache jasper dependencies which seem to have
>> something to do with it: jasper-compiler-5.5.12.jar and
>> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
>> starts, but now all the tasks fail with messages like this:
>>
>> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
>> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>>
>> am i doing something completely wrong, and was i just lucky that this
>> used to run with CDH3? any ideas? thanks! koert
>>
>>
> This sounds like some classpath/versioning thing - MiniMRCluster works for
> me; I even fixed that system property setup in trunk as I got fed up with
> having to do it myself.
>
> What is your CP in your test run?
>
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Koert Kuipers <ko...@tresata.com>.
ant sets classpath for me but i have same issues when i set it myself.
for example i can run it like this:
java -cp lib/test/*:build/classes:build/test
com.tresata.hadoop.mapred.MapRedTest
my dependencies are hadoop-core, hadoop-test and i also had to add
jsr311api to make it work.
ls lib/test gives:
ant-1.6.5.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-core-1.0.4.jar
hadoop-test-1.0.4.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
jets3t-0.7.1.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
joda-time-2.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsr311-api-1.1.1.jar
junit-3.8.1.jar
kfs-0.3.jar
log4j-1.2.16.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
xmlenc-0.52.jar
On Sat, Nov 10, 2012 at 5:24 AM, Steve Loughran <st...@hortonworks.com>wrote:
>
>
> On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
>
>> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
>> unit tests used to run fine but are now misbehaving. i do a simple setup in
>> the unit test like this:
>>
>> private static MiniDFSCluster dfsCluster;
>> private static FileSystem fs;
>> private static MiniMRCluster mrCluster;
>>
>> @BeforeClass
>> public static void oneTimeSetUp() throws Exception {
>> new File("tmp/logs").mkdirs();
>> System.setProperty("hadoop.log.dir", "tmp/logs");
>> System.setProperty("javax.xml.parsers.SAXParserFactory",
>>
>> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
>> final Configuration conf = new Configuration();
>> conf.set("io.skip.checksum.errors", "true");
>> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
>> fs = dfsCluster.getFileSystem();
>> mrCluster = new MiniMRCluster(2,
>> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
>> org.apache.hadoop.mapred.JobConf(conf));
>> }
>>
>> however with hadoop 1.0.4 this results in an exception where the
>> jobtracker dies:
>>
>> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
>> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
>> at
>> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
>> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
>> at
>> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>>
>> not sure what this error is about, but i managed to get around it for now
>> by removing the 2 hadoop apache jasper dependencies which seem to have
>> something to do with it: jasper-compiler-5.5.12.jar and
>> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
>> starts, but now all the tasks fail with messages like this:
>>
>> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
>> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>>
>> am i doing something completely wrong, and was i just lucky that this
>> used to run with CDH3? any ideas? thanks! koert
>>
>>
> This sounds like some classpath/versioning thing -MiniMR cluster works for
> me; I even fixed that system property setup in trunk as I got fed up with
> having to do it myself.
>
> What is your CP in your test run?
>
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Koert Kuipers <ko...@tresata.com>.
ant sets classpath for me but i have same issues when i set it myself.
for example i can run it like this:
java -cp lib/test/*:build/classes:build/test
com.tresata.hadoop.mapred.MapRedTest
my dependencies are hadoop-core, hadoop-test and i also had to add
jsr311api to make it work.
ls lib/test gives:
ant-1.6.5.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-core-1.0.4.jar
hadoop-test-1.0.4.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
jets3t-0.7.1.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
joda-time-2.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsr311-api-1.1.1.jar
junit-3.8.1.jar
kfs-0.3.jar
log4j-1.2.16.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
xmlenc-0.52.jar
On Sat, Nov 10, 2012 at 5:24 AM, Steve Loughran <st...@hortonworks.com>wrote:
>
>
> On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
>
>> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
>> unit tests used to run fine but are now misbehaving. i do a simple setup in
>> the unit test like this:
>>
>> private static MiniDFSCluster dfsCluster;
>> private static FileSystem fs;
>> private static MiniMRCluster mrCluster;
>>
>> @BeforeClass
>> public static void oneTimeSetUp() throws Exception {
>> new File("tmp/logs").mkdirs();
>> System.setProperty("hadoop.log.dir", "tmp/logs");
>> System.setProperty("javax.xml.parsers.SAXParserFactory",
>>
>> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
>> final Configuration conf = new Configuration();
>> conf.set("io.skip.checksum.errors", "true");
>> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
>> fs = dfsCluster.getFileSystem();
>> mrCluster = new MiniMRCluster(2,
>> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
>> org.apache.hadoop.mapred.JobConf(conf));
>> }
>>
>> however with hadoop 1.0.4 this results in an exception where the
>> jobtracker dies:
>>
>> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
>> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
>> at
>> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
>> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
>> at
>> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>>
>> not sure what this error is about, but i managed to get around it for now
>> by removing the 2 hadoop apache jasper dependencies which seem to have
>> something to do with it: jasper-compiler-5.5.12.jar and
>> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
>> starts, but now all the tasks fail with messages like this:
>>
>> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
>> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>>
>> am i doing something completely wrong, and was i just lucky that this
>> used to run with CDH3? any ideas? thanks! koert
>>
>>
> This sounds like some classpath/versioning thing -MiniMR cluster works for
> me; I even fixed that system property setup in trunk as I got fed up with
> having to do it myself.
>
> What is your CP in your test run?
>
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Koert Kuipers <ko...@tresata.com>.
ant sets classpath for me but i have same issues when i set it myself.
for example i can run it like this:
java -cp lib/test/*:build/classes:build/test
com.tresata.hadoop.mapred.MapRedTest
my dependencies are hadoop-core, hadoop-test and i also had to add
jsr311api to make it work.
ls lib/test gives:
ant-1.6.5.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-core-1.0.4.jar
hadoop-test-1.0.4.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
jets3t-0.7.1.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
joda-time-2.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsr311-api-1.1.1.jar
junit-3.8.1.jar
kfs-0.3.jar
log4j-1.2.16.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
xmlenc-0.52.jar
On Sat, Nov 10, 2012 at 5:24 AM, Steve Loughran <st...@hortonworks.com>wrote:
>
>
> On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
>
>> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
>> unit tests used to run fine but are now misbehaving. i do a simple setup in
>> the unit test like this:
>>
>> private static MiniDFSCluster dfsCluster;
>> private static FileSystem fs;
>> private static MiniMRCluster mrCluster;
>>
>> @BeforeClass
>> public static void oneTimeSetUp() throws Exception {
>> new File("tmp/logs").mkdirs();
>> System.setProperty("hadoop.log.dir", "tmp/logs");
>> System.setProperty("javax.xml.parsers.SAXParserFactory",
>>
>> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
>> final Configuration conf = new Configuration();
>> conf.set("io.skip.checksum.errors", "true");
>> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
>> fs = dfsCluster.getFileSystem();
>> mrCluster = new MiniMRCluster(2,
>> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
>> org.apache.hadoop.mapred.JobConf(conf));
>> }
>>
>> however with hadoop 1.0.4 this results in an exception where the
>> jobtracker dies:
>>
>> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
>> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
>> at
>> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
>> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
>> at
>> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>>
>> not sure what this error is about, but i managed to get around it for now
>> by removing the 2 hadoop apache jasper dependencies which seem to have
>> something to do with it: jasper-compiler-5.5.12.jar and
>> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
>> starts, but now all the tasks fail with messages like this:
>>
>> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
>> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>>
>> am i doing something completely wrong, and was i just lucky that this
>> used to run with CDH3? any ideas? thanks! koert
>>
>>
> This sounds like some classpath/versioning thing; MiniMRCluster works for
> me. I even fixed that system property setup in trunk as I got fed up with
> having to do it myself.
>
> What is your CP in your test run?
>
Re: MiniMRCluster not behaving in hadoop 1.0.4
Posted by Steve Loughran <st...@hortonworks.com>.
On 10 November 2012 07:43, Koert Kuipers <ko...@tresata.com> wrote:
> i am porting a map-reduce library from CDH3 to apache hadoop 1.0.4. the
> unit tests used to run fine but are now misbehaving. i do a simple setup in
> the unit test like this:
>
> private static MiniDFSCluster dfsCluster;
> private static FileSystem fs;
> private static MiniMRCluster mrCluster;
>
> @BeforeClass
> public static void oneTimeSetUp() throws Exception {
> new File("tmp/logs").mkdirs();
> System.setProperty("hadoop.log.dir", "tmp/logs");
> System.setProperty("javax.xml.parsers.SAXParserFactory",
>
> "com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl");
> final Configuration conf = new Configuration();
> conf.set("io.skip.checksum.errors", "true");
> dfsCluster = new MiniDFSCluster(conf, 2, true, null);
> fs = dfsCluster.getFileSystem();
> mrCluster = new MiniMRCluster(2,
> dfsCluster.getFileSystem().getUri().toString(), 1, null, null, new
> org.apache.hadoop.mapred.JobConf(conf));
> }
>
> however with hadoop 1.0.4 this results in an exception where the
> jobtracker dies:
>
> 12/11/10 02:38:29 ERROR mapred.MiniMRCluster: Job tracker crashed
> java.lang.NoSuchFieldError: IS_SECURITY_ENABLED
> at
> org.apache.jasper.compiler.JspRuntimeContext.<init>(JspRuntimeContext.java:197)
> at org.apache.jasper.servlet.JspServlet.init(JspServlet.java:150)
> at
> org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:440)
>
> not sure what this error is about, but i managed to get around it for now
> by removing the 2 hadoop apache jasper dependencies which seem to have
> something to do with it: jasper-compiler-5.5.12.jar and
> jasper-runtime-5.5.12.jar. with this "fix" i get further and my jobs
> starts, but now all the tasks fail with messages like this:
>
> attempt_20121110024037623_0001_r_000002_0: Exception in thread "main"
> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/Child
>
> am i doing something completely wrong, and was i just lucky that this used
> to run with CDH3? any ideas? thanks! koert
>
>
This sounds like some classpath/versioning thing; MiniMRCluster works for
me. I even fixed that system property setup in trunk as I got fed up with
having to do it myself.
What is your CP in your test run?
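One way to answer "what is your CP in your test run?" is to dump the classpath from inside the test JVM itself, since that is what actually matters rather than what the build file declares. A minimal, self-contained sketch (the class name is illustrative):

```java
import java.io.File;

// Sketch: print the JVM's effective classpath one entry per line, so a
// test run can show exactly which jars (and in which order) it is using.
public class PrintClasspath {

    // Split java.class.path on the platform path separator (":" on Unix,
    // ";" on Windows).
    public static String[] classpathEntries() {
        return System.getProperty("java.class.path")
                     .split(File.pathSeparator);
    }

    public static void main(String[] args) {
        for (String entry : classpathEntries()) {
            System.out.println(entry);
        }
    }
}
```

Dropping a call like this into the @BeforeClass setup (or running it standalone with the same classpath as the test) makes it easy to spot stray jasper or servlet-api jars and to compare the CDH3 and Apache 1.0.4 runs.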