Posted to dev@knox.apache.org by "Amit Kamble (JIRA)" <ji...@apache.org> on 2013/10/07 13:25:43 UTC

[jira] [Comment Edited] (KNOX-180) Server error while submitting mapreduce job with knox-0.2.0

    [ https://issues.apache.org/jira/browse/KNOX-180?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13788078#comment-13788078 ] 

Amit Kamble edited comment on KNOX-180 at 10/7/13 11:24 AM:
------------------------------------------------------------

Thanks, Kevin, for the suggestions.

Please find below the logs generated by WebHCat:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hcatalog/share/webhcat/svr/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hbase/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hcatalog/share/webhcat/svr/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hbase/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
log4j:ERROR Failed to flush writer,
java.io.IOException: No space left on device
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:282)
	at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:202)
	at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:272)
	at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:276)
	at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:122)
	at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:212)
	at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:58)
	at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:316)
	at org.apache.log4j.DailyRollingFileAppender.subAppend(DailyRollingFileAppender.java:359)
	at org.apache.log4j.WriterAppender.append(WriterAppender.java:160)
	at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
	at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
	at org.apache.log4j.Category.callAppenders(Category.java:206)
	at org.apache.log4j.Category.forcedLog(Category.java:391)
	at org.apache.log4j.Category.log(Category.java:856)
	at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:133)
	at org.apache.hcatalog.templeton.tool.ZooKeeperCleanup.getChildList(ZooKeeperCleanup.java:149)
	at org.apache.hcatalog.templeton.tool.ZooKeeperCleanup.run(ZooKeeperCleanup.java:102)

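The IOException above ("No space left on device") indicates that a partition on the WebHCat host is full, which by itself can cause Templeton to answer requests with an HTTP 500. It is worth confirming free space before re-running the job. A minimal sketch in Groovy (the /usr/lib/hcatalog path is only an example taken from the jar paths in the log; check the partition that actually holds your Templeton logs):

    // Hedged sketch: confirm free disk space on the WebHCat host.
    // "df -h" lists every mounted filesystem; look for 100% usage.
    println "df -h".execute().text

    // Without shelling out: usable bytes on the filesystem backing a path.
    // /usr/lib/hcatalog is just an example path from the log above.
    def dir = new File( "/usr/lib/hcatalog" )
    printf "usable space: %.1f GB%n", dir.usableSpace / 1e9

If a partition shows 100% use, freeing space (for example by rotating or truncating old logs) and restarting WebHCat is the first thing to try before re-submitting the job.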

Logs generated by the gateway server after setting the logging level to DEBUG:

bash-4.1# java -jar bin/shell.jar
knox:000> import com.jayway.jsonpath.JsonPath
knox:000> import org.apache.hadoop.gateway.shell.Hadoop
knox:000> import org.apache.hadoop.gateway.shell.hdfs.Hdfs
knox:000> import org.apache.hadoop.gateway.shell.job.Job
knox:000> import static java.util.concurrent.TimeUnit.SECONDS
knox:000> gateway = "https://localhost:8443/gateway/sample"
knox:000> username = "mapred"
knox:000> password = "mapred-password"
knox:000> dataFile = "LICENSE"
knox:000> jarFile = "samples/hadoop-examples.jar"
knox:000>
knox:000>
knox:000> hadoop = Hadoop.login( gateway, username, password )
knox:000> println "Delete /tmp/test " + Hdfs.rm(hadoop).file("/tmp/test").recursive().now().statusCode
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: DELETE http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&recursive=true&op=DELETE
Delete /tmp/test 200
knox:000> println "Create /tmp/test " + Hdfs.mkdir(hadoop).dir( "/tmp/test").now().statusCode
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&op=MKDIRS
Create /tmp/test 200
knox:000> putData = Hdfs.put(hadoop).file( dataFile ).to( "/tmp/test/input/FILE" ).later() {
knox:001> println "Put /tmp/test/input/FILE " + it.statusCode }
knox:000> putJar = Hdfs.put(hadoop).file( jarFile ).to( "/tmp/test/hadoop-examples.jar" ).later() {
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&op=CREATE
knox:001> println "Put /tmp/test/hadoop-examples.jar " + it.statusCode }
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50075/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&overwrite=false&op=CREATE
knox:000> hadoop.waitFor( putData, putJar )
Put /tmp/test/input/FILE 201
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&op=CREATE
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50075/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&overwrite=false&op=CREATE
Put /tmp/test/hadoop-examples.jar 201
knox:000> jobId = Job.submitJava(hadoop).jar( "/tmp/test/hadoop-examples.jar" ).app( "wordcount" ).input( "/tmp/test/input" ).output( "/tmp/test/output" ).now().jobId
13/10/07 06:39:20 DEBUG hadoop.gateway: Dispatching request: POST http://localhost:50111/templeton/v1/mapreduce/jar
ERROR org.apache.hadoop.gateway.shell.HadoopException:
org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
        at org.apache.hadoop.gateway.shell.AbstractRequest.now (AbstractRequest.java:72)
        at org.apache.hadoop.gateway.shell.AbstractRequest$now.call (Unknown Source)
        at groovysh_evaluate.run (groovysh_evaluate:8)
        ...
knox:000> println "Submitted job " + jobId
ERROR groovy.lang.MissingPropertyException:
No such property: jobId for class: groovysh_evaluate
        at groovysh_evaluate.run (groovysh_evaluate:8)
        ...
knox:000> done = false
knox:000> count = 0
knox:000> while( !done && count++ < 60 ) {
knox:001> sleep( 1000 )
knox:002> json = Job.queryStatus(hadoop).jobId(jobId).now().string
knox:003> done = JsonPath.read( json, "\$.status.jobComplete" )
knox:004> }
ERROR groovy.lang.MissingPropertyException:
No such property: jobId for class: groovysh_evaluate
        at groovysh_evaluate.run (groovysh_evaluate:10)
        ...
knox:004> println "Done " + done
knox:005> println "Shutdown " + hadoop.shutdown( 10, SECONDS )
knox:006> exit
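
For what it's worth, the two MissingPropertyException errors above are just fallout from the failed POST: now() throws, so jobId is never assigned, and the later statements that reference it fail in turn. Below is a hedged sketch of the same submission step with the failure handled explicitly; the Job and JsonPath calls are exactly those used in the session above, and only the try/catch and the null guard are additions. It assumes the same session, i.e. that hadoop was obtained from Hadoop.login( gateway, username, password ) as shown earlier.

    import com.jayway.jsonpath.JsonPath
    import org.apache.hadoop.gateway.shell.HadoopException
    import org.apache.hadoop.gateway.shell.job.Job

    def jobId = null
    try {
        jobId = Job.submitJava(hadoop).jar( "/tmp/test/hadoop-examples.jar" ).app( "wordcount" ).input( "/tmp/test/input" ).output( "/tmp/test/output" ).now().jobId
        println "Submitted job " + jobId
    } catch( HadoopException e ) {
        // now() throws when the gateway returns a non-2xx status; the
        // message carries the status line ("HTTP/1.1 500 Server Error" here).
        println "Submission failed: " + e.getMessage()
    }
    if( jobId != null ) {
        def done = false
        def count = 0
        while( !done && count++ < 60 ) {
            sleep( 1000 )
            def json = Job.queryStatus(hadoop).jobId(jobId).now().string
            done = JsonPath.read( json, '$.status.jobComplete' )
        }
        println "Done " + done
    }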


> Server error while submitting mapreduce job with knox-0.2.0
> -----------------------------------------------------------
>
>                 Key: KNOX-180
>                 URL: https://issues.apache.org/jira/browse/KNOX-180
>             Project: Apache Knox
>          Issue Type: Bug
>          Components: Server
>         Environment: knox-0.2.0
>            Reporter: Amit Kamble
>
> I am using HDP 1.3.2 and knox-0.2.0.
> I am trying to run the sample MapReduce job (wordcount) provided with the knox-0.2.0 distribution (i.e. hadoop-examples.jar) using Groovy, but submitting it gives the following error:
> DEBUG hadoop.gateway: Dispatching request: DELETE http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&recursive=true&op=DELETE
> Delete /tmp/test 200
> DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&op=MKDIRS
> Create /tmp/test 200
> DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50070/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50075/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&overwrite=false&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT http://localhost:50075/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&overwrite=false&op=CREATE
> Put /tmp/test/hadoop-examples.jar 201
> Put /tmp/test/input/FILE 201
> DEBUG hadoop.gateway: Dispatching request: POST http://localhost:50111/templeton/v1/mapreduce/jar
> Caught: org.apache.hadoop.gateway.shell.HadoopException: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
> org.apache.hadoop.gateway.shell.HadoopException: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>         at org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
>         at org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown Source)
>         at ExampleSubmitJob.run(ExampleSubmitJob.groovy:42)
>         at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
>         at org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
>         at org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
>         at org.apache.hadoop.gateway.launcher.Command.run(Command.java:100)
>         at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
>         at org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
>         at org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
>         at org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
>         at org.apache.hadoop.gateway.shell.job.Java$Request.access$100(Java.java:38)
>         at org.apache.hadoop.gateway.shell.job.Java$Request$1.call(Java.java:82)
>         at org.apache.hadoop.gateway.shell.job.Java$Request$1.call(Java.java:70)
>         at org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
>         ... 8 more
> In the case of an Oozie workflow, it works properly and the workflow is submitted successfully. In short, things work fine with WebHDFS and Oozie but not with WebHCat/Templeton.
> Could you please suggest whether I have missed any configuration setting?
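
A quick way to isolate whether that 500 originates in WebHCat itself or in the gateway is to query Templeton's status endpoint directly on the WebHCat host, bypassing Knox. A hedged sketch in Groovy; it assumes WebHCat listens on localhost:50111, as the DEBUG dispatch lines above show, and uses the stock /templeton/v1/status endpoint:

    // Hedged sketch: call WebHCat directly, bypassing the Knox gateway.
    // localhost:50111 is taken from the dispatch logs above.
    def conn = new URL( "http://localhost:50111/templeton/v1/status" ).openConnection()
    def code = conn.responseCode
    println "WebHCat direct status: " + code
    // A healthy server answers 200 with {"status":"ok","version":"v1"};
    // a 500 here would point at WebHCat (e.g. the full disk noted above)
    // rather than at Knox.
    println( code < 400 ? conn.inputStream.text : conn.errorStream?.text )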



--
This message was sent by Atlassian JIRA
(v6.1#6144)