Posted to common-dev@hadoop.apache.org by "Steve Loughran (JIRA)" <ji...@apache.org> on 2008/09/01 15:46:44 UTC

[jira] Commented: (HADOOP-4052) org.apache.hadoop.streaming.TestUlimit fails on JRockit 64-bit; not enough memory

    [ https://issues.apache.org/jira/browse/HADOOP-4052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12627457#action_12627457 ] 

Steve Loughran commented on HADOOP-4052:
----------------------------------------

Stack trace from the test:

08/09/01 13:47:05 [Thread-84] INFO mapred.TaskTracker : LaunchTaskAction: attempt_200809011346_0001_m_000002_0
08/09/01 13:47:05 [IPC Server handler 9 on 43325] INFO mapred.TaskInProgress : Error from attempt_200809011346_0001_m_000001_0: java.io.IOException: Task process exit with nonzero status of 1: bash -c "'ulimit' '-v' '786432' ;exec '/usr/java/jrockit-1.6.0_02/jre/bin/java' '-Djava.library.path=/usr/java/jrockit-1.6.0_02/jre/lib/amd64/jrockit:/usr/java/jrockit-1.6.0_02/jre/lib/amd64:/usr/java/jrockit-1.6.0_02/jre/../lib/amd64:/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/mapred/local/1_0/taskTracker/jobcache/job_200809011346_0001/attempt_200809011346_0001_m_000001_0/work' '-Xmx200m' '-Djava.io.tmpdir=/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/mapred/local/1_0/taskTracker/jobcache/job_200809011346_0001/attempt_200809011346_0001_m_000001_0/work/tmp' '-classpath' '/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test:/home/slo/Java/Apache/hadoop-core/build/test/classes:/home/slo/Java/Apache/hadoop-core/src/contrib/test:/home/slo/Java/Apache/hadoop-core/conf:/home/slo/Java/Apache/hadoop-core/build:/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/examples:/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/classes:/home/slo/Java/Apache/hadoop-core/build/classes:/home/slo/Java/Apache/hadoop-core/lib/commons-cli-2.0-SNAPSHOT.jar:/home/slo/Java/Apache/hadoop-core/lib/commons-codec-1.3.jar:/home/slo/Java/Apache/hadoop-core/lib/commons-httpclient-3.0.1.jar:/home/slo/Java/Apache/hadoop-core/lib/commons-logging-1.0.4.jar:/home/slo/Java/Apache/hadoop-core/lib/commons-logging-api-1.0.4.jar:/home/slo/Java/Apache/hadoop-core/lib/commons-net-1.4.1.jar:/home/slo/Java/Apache/hadoop-core/lib/jets3t-0.6.1.jar:/home/slo/Java/Apache/hadoop-core/lib/jetty-5.1.4.jar:/home/slo/Java/Apache/hadoop-core/lib/jetty-ext/commons-el.jar:/home/slo/Java/Apache/hadoop-core/lib/jetty-ext/jasper-compiler.jar:/home/slo/Java/Apache/hadoop-core/lib/jetty-ext/jasper-runtime.jar:/home/slo/Java/Apache/hadoop-core/lib/jetty-ext/jsp-api.jar:/home/slo/Java/Apache/hadoop-core/lib/junit-3.8.1.jar:/home/slo/Java/Apache/hadoop-core/lib/kfs-0.2.0.jar:/home/slo/Java/Apache/hadoop-core/lib/log4j-1.2.15.jar:/home/slo/Java/Apache/hadoop-core/lib/oro-2.0.8.jar:/home/slo/Java/Apache/hadoop-core/lib/servlet-api.jar:/home/slo/Java/Apache/hadoop-core/lib/slf4j-api-1.4.3.jar:/home/slo/Java/Apache/hadoop-core/lib/slf4j-log4j12-1.4.3.jar:/home/slo/Java/Apache/hadoop-core/lib/xmlenc-0.52.jar:/home/slo/Java/Apache/ant/lib/junit-3.8.2.jar:/home/slo/Java/Apache/ant/lib/ant-launcher.jar:/home/slo/Java/Apache/ant/lib/ant.jar:/home/slo/Java/Apache/ant/lib/ant-junit.jar::/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/mapred/local/1_0/taskTracker/jobcache/job_200809011346_0001/jars/classes:/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/mapred/local/1_0/taskTracker/jobcache/job_200809011346_0001/jars:/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/mapred/local/1_0/taskTracker/jobcache/job_200809011346_0001/attempt_200809011346_0001_m_000001_0/work' '-Dhadoop.log.dir=/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/logs' '-Dhadoop.root.logger=INFO,TLA' '-Dhadoop.tasklog.taskid=attempt_200809011346_0001_m_000001_0' '-Dhadoop.tasklog.totalLogFileSize=0' 'org.apache.hadoop.mapred.TaskTracker$Child' '127.0.0.1' '60775' 'attempt_200809011346_0001_m_000001_0'  < /dev/null  1>> /home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/logs/userlogs/attempt_200809011346_0001_m_000001_0/stdout 2>> 
/home/slo/Java/Apache/hadoop-core/build/contrib/streaming/test/logs/userlogs/attempt_200809011346_0001_m_000001_0/stderr" 
	at org.apache.hadoop.mapred.TaskRunner.runChild(TaskRunner.java:463)
	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:404)
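
For reference, the launch command in the log wraps the child JVM in "ulimit -v 786432", i.e. a cap of 786432 KB (768 MB) of virtual address space. A minimal sketch for reproducing the failure outside the test, assuming the same JRockit install path as above:

    bash -c "ulimit -v 786432; exec /usr/java/jrockit-1.6.0_02/jre/bin/java -Xmx200m -version"
    echo $?   # nonzero when the 64-bit JVM cannot reserve enough address space to start

Raising the limit (or lifting it with "ulimit -v unlimited") should let the same command print the JVM version and exit 0.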

> org.apache.hadoop.streaming.TestUlimit fails on JRockit 64-bit; not enough memory
> ---------------------------------------------------------------------------------
>
>                 Key: HADOOP-4052
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4052
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: contrib/streaming
>    Affects Versions: 0.19.0
>         Environment: Linux morzine 2.6.22-15-generic #1 SMP Fri Jul 11 18:56:36 UTC 2008 x86_64 GNU/Linux
> java version "1.6.0_02"
> Java(TM) SE Runtime Environment (build 1.6.0_02-b05)
> BEA JRockit(R) (build R27.4.0-90-89592-1.6.0_02-20070928-1715-linux-x86_64, compiled mode)
>            Reporter: Steve Loughran
>
> The testUlimit test sets a memory limit that is too small for the JVM to start, so the task fails with a -1 response instead, which breaks the test.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.