Posted to user@livy.apache.org by Alex Rodrigues <ar...@actificent.com> on 2017/08/15 18:45:54 UTC

Livy on OpenJDK 8

Hi Livyers,

I'm hitting a rather nasty error when attempting to initialize a session.

Livy is running on Docker
Livy branch 0.3.0 is compiled with Scala 2.10
Spark 2.1.1 compiled with Scala 2.10
Spark Master = local[*]


If running with OpenJDK 1.8.0_141, the server spills this to the log:

17/08/15 00:13:21.242 qtp2001223946-27 INFO InteractiveSession$: Creating
LivyClient for sessionId: 6
17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Your hostname,
394f110eb785, resolves to a loopback address, but we couldn't find any
external IP address!
17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Set
livy.rsc.rpc.server.address if you need to bind to another address.
17/08/15 00:13:21.251 qtp2001223946-27 DEBUG RSCClient: Sending
JobRequest[5cd32687-53b0-4d29-8436-a1ea38faab62].
17/08/15 00:13:21.251 qtp2001223946-27 INFO InteractiveSessionManager:
Registering new session 6
17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN No
appenders could be found for logger (com.cloudera.livy.rsc.driver.
RSCDriver).
17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN
Please initialize the log4j system properly.
17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message
of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
Handling SASL challenge message...
17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
Sending SASL challenge response...
17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message
of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (98 bytes)
17/08/15 00:13:22.287 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message
of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (275 bytes)
17/08/15 00:13:22.287 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
Handling SASL challenge message...
17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
Sending SASL challenge response...
17/08/15 00:13:22.288 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message
of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (45 bytes)
17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: SASL
negotiation finished with QOP auth.
17/08/15 00:13:22.289 RPC-Handler-4 DEBUG ContextLauncher: New RPC client
connected from [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450].
17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message
of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message
of type com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress (73 bytes)
17/08/15 00:13:22.296 RPC-Handler-4 DEBUG RpcDispatcher:
[RegistrationHandler] Received RPC message: type=CALL id=0
payload=com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress
17/08/15 00:13:22.296 RPC-Handler-4 DEBUG ContextLauncher: Received driver
info for client [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450]:
394f110eb785/35784.
17/08/15 00:13:22.296 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message
of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/08/15 00:13:22.297 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message
of type com.cloudera.livy.rsc.rpc.Rpc$NullMessage (2 bytes)
17/08/15 00:13:22.322 RSCClient-7-3 DEBUG KryoMessageCodec: Encoded message
of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception in
thread "main" java.lang.VerifyError: Bad type on operand stack*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception
Details:*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Location:*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:
com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V
@7: invokevirtual*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Reason:*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     Type
'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not
assignable to 'scala/tools/nsc/interpreter/ILoop'*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Current
Frame:*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     bci: @7*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     flags: { }*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     locals: {
'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String',
'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }*
*17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     stack: {
'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:   Bytecode:*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000000:
b201 052a b600 52b6 0109 bb00 2359 2a2b*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000010:
2c2d 1904 b701 63b6 0166 57b1*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
com.cloudera.livy.repl.ReplDriver.initializeContext(ReplDriver.scala:49)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
com.cloudera.livy.rsc.driver.RSCDriver.run(RSCDriver.java:328)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
com.cloudera.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:86)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
java.lang.reflect.Method.invoke(Method.java:498)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)*
*17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)*
17/08/15 00:13:22.338 RSCClient-7-4 ERROR RSCClient: Failed to connect to
context.
javax.security.sasl.SaslException: Client closed before SASL negotiation
finished.
at com.cloudera.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:416)
at com.cloudera.livy.rsc.rpc.SaslHandler.channelInactive(
SaslHandler.java:92)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(
AbstractChannelHandlerContext.java:208)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(
AbstractChannelHandlerContext.java:194)
at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(
ChannelInboundHandlerAdapter.java:75)
at com.cloudera.livy.rsc.rpc.KryoMessageCodec.channelInactive(
KryoMessageCodec.java:104)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(
AbstractChannelHandlerContext.java:208)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(
AbstractChannelHandlerContext.java:194)
at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(
ChannelInboundHandlerAdapter.java:75)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(
AbstractChannelHandlerContext.java:208)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(
AbstractChannelHandlerContext.java:194)
at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(
DefaultChannelPipeline.java:828)
at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(
AbstractChannel.java:621)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(
SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.
run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:748)
17/08/15 00:13:22.339 RSCClient-7-4 INFO RSCClient: Failing pending job
5cd32687-53b0-4d29-8436-a1ea38faab62 due to shutdown.
17/08/15 00:13:22.340 RSCClient-7-4 DEBUG InteractiveSession:
InteractiveSession 6 session state change from starting to error
17/08/15 00:13:22.341 ForkJoinPool-1-worker-11 INFO InteractiveSession:
Stopping InteractiveSession 6...


With Hotspot (1.8.0_144), it runs fine.

It seems to be JDK-8077810
<https://bugs.openjdk.java.net/browse/JDK-8077810>, but it would be great if
someone could confirm this behaviour and say whether it is a known Livy problem.
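
One quick way to rule out a mixed-Scala classpath, which is the classic cause
of exactly this "SparkILoop is not assignable to ILoop" VerifyError, is to scan
the jar names on the classpath for their Scala binary suffix. A minimal sketch,
with hypothetical jar names:

```python
import re
from collections import Counter

def scala_suffixes(jar_names):
    """Count Scala binary-version suffixes (e.g. '2.10', '2.11') in jar names."""
    counts = Counter()
    for name in jar_names:
        # Matches the _<major.minor> suffix convention used by Scala artifacts.
        match = re.search(r"_(\d+\.\d+)(?:-|\.jar$)", name)
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical directory listing; a healthy classpath shows a single suffix.
jars = ["spark-repl_2.10-2.1.1.jar", "livy-repl_2.11-0.3.0.jar", "guava-14.0.1.jar"]
print(scala_suffixes(jars))
```

If more than one suffix shows up, the repl jar and Spark were built against
different Scala versions.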

What JVM do you use to run Livy?


Many thanks,
Alex

Re: Livy on OpenJDK 8

Posted by Saisai Shao <sa...@gmail.com>.
The "x" is just a placeholder; you should put the actual Scala version of your
Spark 2.1.1 build there...

Anyway, checking this command alone is not useful. You should check the Livy
log to see whether the right repl jar is added to Spark's runtime.

I think you should make sure everything is correct before suspecting a JVM
issue.
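
For reference, the corrected setting in livy.conf might look like the line
below; 2.10.6 is only an example patch version, substitute the one your Spark
2.1.1 build was actually compiled with:

```
# livy.conf: select the Scala 2.10 repl jars instead of the 2.11 default
livy.spark.scala-version = 2.10.6
```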

On Wed, Aug 16, 2017 at 5:41 PM, Alex Rodrigues <ar...@actificent.com>
wrote:

> Added
>
> livy.spark.scala-version = "2.10.x"
>
> to livy.conf. Same result.
>
> The command looks the same:
>
> /docker-java-home/bin/java -cp /apps/spark-2.1.1_2.10/conf/:/
> apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf -Xmx1g
> org.apache.spark.deploy.SparkSubmit --properties-file /tmp/
> livyConf3184771000880425953.properties --class
> com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal
>
>
> On Wed, Aug 16, 2017 at 10:33 AM, Saisai Shao <sa...@gmail.com>
> wrote:
>
>> Did you set "livy.spark.scala-version"? For Spark 2.1.1, if you don't set
>> this configuration, Livy will use the repl_2.11 jars by default, since the
>> default Scala version of Spark 2.1.1 is 2.11, so you have to explicitly set
>> "livy.spark.scala-version" to "2.10.x".
>>
>> On Wed, Aug 16, 2017 at 5:21 PM, Alex Rodrigues <
>> arodrigues@actificent.com> wrote:
>>
>>> Hi
>>>
>>>
>>> Thanks for your response.
>>>
>>> The command executed by LivyServer when initializing a session is:
>>>
>>> /docker-java-home/bin/java -cp /apps/spark-2.1.1_2.10/conf/:/
>>> apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf -Xmx1g
>>> org.apache.spark.deploy.SparkSubmit --properties-file
>>> /tmp/livyConf5628211771459249745.properties --class
>>> com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal
>>>
>>>
>>> Livy server command:
>>>
>>>  /docker-java-home/bin/java -cp /apps/livy/jars/*:/apps/livy/conf:/etc/hadoop/conf:
>>> com.cloudera.livy.server.LivyServer
>>>
>>> Please see attached the list of files in /apps/spark-2.1.1_2.10/jars/*,
>>> and other with Livy jars (present in /apps/livy/jars/*).
>>>
>>> The behaviour seems pretty consistent, and the issue cannot be reproduced
>>> with the Hotspot JVM.
>>>
>>>
>>> On Wed, Aug 16, 2017 at 9:51 AM, Saisai Shao <sa...@gmail.com>
>>> wrote:
>>>
>>>> Does it fail consistently? Can you please check whether Livy picks the
>>>> right repl jar (2.10) for Spark 2.1.1 + Scala 2.10?

Re: Livy on OpenJDK 8

Posted by Alex Rodrigues <ar...@actificent.com>.
Added

livy.spark.scala-version = "2.10.x"

to livy.conf. Same result.

The command looks the same:

/docker-java-home/bin/java -cp
/apps/spark-2.1.1_2.10/conf/:/apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf
-Xmx1g org.apache.spark.deploy.SparkSubmit --properties-file
/tmp/livyConf3184771000880425953.properties --class
com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal


On Wed, Aug 16, 2017 at 10:33 AM, Saisai Shao <sa...@gmail.com>
wrote:

> Did you set "livy.spark.scala-version"? For Spark 2.1.1, if you don't set
> this configuration, Livy will use the repl_2.11 jars by default, since the
> default Scala version of Spark 2.1.1 is 2.11, so you have to explicitly set
> "livy.spark.scala-version" to "2.10.x".
>>>> 17/08/15 00:13:22.340 RSCClient-7-4 DEBUG InteractiveSession:
>>>> InteractiveSession 6 session state change from starting to error
>>>> 17/08/15 00:13:22.341 ForkJoinPool-1-worker-11 INFO InteractiveSession:
>>>> Stopping InteractiveSession 6...
>>>>
>>>>
>>>> With Hotspot (1.8.0_144), it runs fine.
>>>>
>>>> It seems to be JDK-8077810
>>>> <https://bugs.openjdk.java.net/browse/JDK-8077810> but would be great
>>>> if someone could confirm this behaviour and if this is a known Livy problem.
>>>>
>>>> What JVM do you use to run Livy?
>>>>
>>>>
>>>> Many thanks,
>>>> Alex
>>>>
>>>>
>>>>
>>>
>>
>

Re: Livy on OpenJDK 8

Posted by Saisai Shao <sa...@gmail.com>.
Did you set "livy.spark.scala-version", for Spark 2.1.1 if you don't set
this configuration, Livy will use repl_2.11 jars by default, since default
scala version of Spark 2.1.1 is 2.11, so you have to explicitly specify "
livy.spark.scala-version" to "2.10.x".
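For reference, a minimal sketch of what that entry would look like in livy.conf
(the conf-file location and the exact accepted value format may differ per Livy
version; check the livy.conf template shipped with your build):

```
# livy.conf -- match Livy's repl jars to the Scala version Spark was built with.
# Without this, Livy 0.3 assumes Spark 2.1.x's upstream default of Scala 2.11.
livy.spark.scala-version = 2.10
```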

On Wed, Aug 16, 2017 at 5:21 PM, Alex Rodrigues <ar...@actificent.com>
wrote:

> Hi
>
>
> Thanks for your response.
>
> The command executed by LivyServer when initializing a session is:
>
> /docker-java-home/bin/java -cp /apps/spark-2.1.1_2.10/conf/:/apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf -Xmx1g org.apache.spark.deploy.SparkSubmit --properties-file /tmp/livyConf5628211771459249745.properties --class com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal
>
>
> Livy server command:
>
>  /docker-java-home/bin/java -cp /apps/livy/jars/*:/apps/livy/conf:/etc/hadoop/conf: com.cloudera.livy.server.LivyServer
>
> Please find attached the list of files in /apps/spark-2.1.1_2.10/jars/*,
> and another with the Livy jars (present in /apps/livy/jars/*).
>
> The behaviour is consistent, and the issue cannot be reproduced with the
> HotSpot JVM.
>
>
> On Wed, Aug 16, 2017 at 9:51 AM, Saisai Shao <sa...@gmail.com>
> wrote:
>
>> Does it fail consistently? Can you please check whether Livy picks the
>> right repl jar (2.10) for Spark 2.1.1 + Scala 2.10?

Re: Livy on OpenJDK 8

Posted by Alex Rodrigues <ar...@actificent.com>.
Hi


Thanks for your response.

The command executed by LivyServer when initializing a session is:

/docker-java-home/bin/java -cp /apps/spark-2.1.1_2.10/conf/:/apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf -Xmx1g org.apache.spark.deploy.SparkSubmit --properties-file /tmp/livyConf5628211771459249745.properties --class com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal


Livy server command:

 /docker-java-home/bin/java -cp /apps/livy/jars/*:/apps/livy/conf:/etc/hadoop/conf: com.cloudera.livy.server.LivyServer

Please find attached the list of files in /apps/spark-2.1.1_2.10/jars/*, and
another with the Livy jars (present in /apps/livy/jars/*).

The behaviour is consistent, and the issue cannot be reproduced with the
HotSpot JVM.


On Wed, Aug 16, 2017 at 9:51 AM, Saisai Shao <sa...@gmail.com> wrote:

> Does it fail consistently? Can you please check whether Livy picks the
> right repl jar (2.10) for Spark 2.1.1 + Scala 2.10?

Re: Livy on OpenJDK 8

Posted by Saisai Shao <sa...@gmail.com>.
Does it fail consistently? Can you please check whether Livy picks the right
repl jar (2.10) for Spark 2.1.1 + Scala 2.10?
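One way to verify is to look at the Scala binary-version suffix embedded in the
repl jar file names on Livy's classpath; a minimal sketch of that check (the jar
names here are illustrative, not taken from a real install):

```python
import re

def scala_binary_version(jar_name):
    """Extract the Scala binary-version suffix (e.g. '2.10', '2.11') from a jar name."""
    m = re.search(r"_(2\.\d+)[-.]", jar_name)
    return m.group(1) if m else None

# Illustrative jar names; on a real install you would list Livy's repl jar directory.
for jar in ["livy-repl_2.10-0.3.0.jar", "livy-repl_2.11-0.3.0.jar", "spark-core.jar"]:
    print(jar, "->", scala_binary_version(jar))
```

If the suffix reported here does not match the Scala version Spark was built
with, the VerifyError above is the expected symptom.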

On Wed, Aug 16, 2017 at 2:45 AM, Alex Rodrigues <ar...@actificent.com>
wrote:

> Hi Livyers,
>
> I'm hitting a rather nasty error when attempting to initialize a session.
>
> Livy is running on Docker
> Livy branch 0.3.0 is compiled with Scala 2.10
> Spark 2.1.1 compiled with Scala 2.10
> Spark Master = local[*]
>
>
> If running with OpenJDK 1.8.0_141, the server spills this to the log:
>
> 17/08/15 00:13:21.242 qtp2001223946-27 INFO InteractiveSession$: Creating
> LivyClient for sessionId: 6
> 17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Your hostname,
> 394f110eb785, resolves to a loopback address, but we couldn't find any
> external IP address!
> 17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Set
> livy.rsc.rpc.server.address if you need to bind to another address.
> 17/08/15 00:13:21.251 qtp2001223946-27 DEBUG RSCClient: Sending
> JobRequest[5cd32687-53b0-4d29-8436-a1ea38faab62].
> 17/08/15 00:13:21.251 qtp2001223946-27 INFO InteractiveSessionManager:
> Registering new session 6
> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN No
> appenders could be found for logger (com.cloudera.livy.rsc.driver.
> RSCDriver).
> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN
> Please initialize the log4j system properly.
> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN See
> http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
> Handling SASL challenge message...
> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
> Sending SASL challenge response...
> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (98 bytes)
> 17/08/15 00:13:22.287 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (275 bytes)
> 17/08/15 00:13:22.287 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
> Handling SASL challenge message...
> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
> Sending SASL challenge response...
> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (45 bytes)
> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler:
> SASL negotiation finished with QOP auth.
> 17/08/15 00:13:22.289 RPC-Handler-4 DEBUG ContextLauncher: New RPC client
> connected from [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450].
> 17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
> 17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded
> message of type com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress
> (73 bytes)
> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG RpcDispatcher:
> [RegistrationHandler] Received RPC message: type=CALL id=0
> payload=com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress
> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG ContextLauncher: Received driver
> info for client [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450]:
> 394f110eb785/35784.
> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
> 17/08/15 00:13:22.297 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$NullMessage (2 bytes)
> 17/08/15 00:13:22.322 RSCClient-7-3 DEBUG KryoMessageCodec: Encoded
> message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception in
> thread "main" java.lang.VerifyError: Bad type on operand stack*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception
> Details:*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Location:*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:
> com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V
> @7: invokevirtual*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Reason:*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     Type
> 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not
> assignable to 'scala/tools/nsc/interpreter/ILoop'*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Current
> Frame:*
> *17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     bci: @7*
> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     flags: { }
> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:   Bytecode:
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000010: 2c2d 1904 b701 63b6 0166 57b1
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.repl.ReplDriver.initializeContext(ReplDriver.scala:49)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.rsc.driver.RSCDriver.run(RSCDriver.java:328)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:86)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at java.lang.reflect.Method.invoke(Method.java:498)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 17/08/15 00:13:22.338 RSCClient-7-4 ERROR RSCClient: Failed to connect to context.
> javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
> at com.cloudera.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:416)
> at com.cloudera.livy.rsc.rpc.SaslHandler.channelInactive(SaslHandler.java:92)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
> at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
> at com.cloudera.livy.rsc.rpc.KryoMessageCodec.channelInactive(KryoMessageCodec.java:104)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
> at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
> at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
> at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
> at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
> at java.lang.Thread.run(Thread.java:748)
> 17/08/15 00:13:22.339 RSCClient-7-4 INFO RSCClient: Failing pending job 5cd32687-53b0-4d29-8436-a1ea38faab62 due to shutdown.
> 17/08/15 00:13:22.340 RSCClient-7-4 DEBUG InteractiveSession: InteractiveSession 6 session state change from starting to error
> 17/08/15 00:13:22.341 ForkJoinPool-1-worker-11 INFO InteractiveSession: Stopping InteractiveSession 6...
>
>
> With HotSpot (1.8.0_144), it runs fine.
>
> It seems to be JDK-8077810
> <https://bugs.openjdk.java.net/browse/JDK-8077810>, but it would be great if
> someone could confirm this behaviour and tell me whether it is a known Livy problem.
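>
> In case anyone else hits this before a fixed OpenJDK build lands: since the
> log shows the bytecode verifier rejecting the class (the flags/locals/stack
> dump above), one untested workaround might be to disable verification in the
> driver JVM. The property below is my guess at where the flag would go for a
> Livy-launched Spark driver, not something I have confirmed:
>
>     # spark-defaults.conf -- untested workaround, use with caution:
>     # -Xverify:none turns off bytecode verification for the driver JVM.
>     spark.driver.extraJavaOptions  -Xverify:none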
>
> What JVM do you use to run Livy?
>
>
> Many thanks,
> Alex