Posted to user@livy.apache.org by Junaid Nasir <jn...@an10.io> on 2017/10/25 10:55:47 UTC

Livy with DC/OS Mesos

Hi,
I am trying to run Livy with DC/OS Mesos. It tries to create the driver, but that
fails with the error below. Is there a way I can provide it with the spark-internal file?
I am unable to find what this spark-internal file does, and I cannot find it
anywhere. The only thing I found related to it was in ContextLauncher.java line
232: `launcher.setAppResource("spark-internal");`

Failed to fetch spark-internal

I1025 10:41:45.202145 27901 fetcher.cpp:531] Fetcher Info: {"cache_directory":"\/tmp\/mesos\/fetch\/slaves\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-S14","items":[{"action":"BYPASS_CACHE","uri":{"cache":false,"extract":true,"value":"spark-internal"}}],"sandbox_directory":"\/var\/lib\/mesos\/slave\/slaves\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-S14\/frameworks\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-0003\/executors\/driver-20171025104145-0034\/runs\/9976b5de-0ed7-428e-a23c-59056c6f29ed"}
I1025 10:41:45.204314 27901 fetcher.cpp:442] Fetching URI 'spark-internal'
I1025 10:41:45.204329 27901 fetcher.cpp:283] Fetching directly into the sandbox directory
I1025 10:41:45.204346 27901 fetcher.cpp:220] Fetching URI 'spark-internal'
Failed to fetch 'spark-internal': A relative path was passed for the resource but the Mesos framework home was not specified. Please either provide this config option or avoid using a relative path


Spark-submit works fine; I am able to run a job with the following command:
bin/spark-submit --class org.apache.spark.examples.SparkPi --master
mesos://10.128.0.23:22401 --deploy-mode cluster --supervise --executor-memory 1G
--total-executor-cores 1 --conf
spark.mesos.executor.docker.image=mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6
--conf spark.executor.home=/opt/spark/dist
https://downloads.mesosphere.com/spark/examples/pi.py 30
I have set the Spark master to the same IP and set cluster mode. I am sending a POST
request to /sessions with these parameters:

data = {'kind': 'spark', 'conf': {'spark.mesos.executor.docker.image': 'mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6', 'spark.executor.home': '/opt/spark/dist'}}
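
For reference, here is the same request sent programmatically; a minimal sketch using
Python's requests library (the Livy host below is a placeholder, not a value from this
thread; 8998 is Livy's default port):

import json
import requests

# Placeholder Livy endpoint
LIVY_URL = 'http://<livy-server>:8998'

data = {
    'kind': 'spark',
    'conf': {
        'spark.mesos.executor.docker.image': 'mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6',
        'spark.executor.home': '/opt/spark/dist',
    },
}

# Create an interactive session; the response contains the new session id and its state
resp = requests.post(LIVY_URL + '/sessions',
                     data=json.dumps(data),
                     headers={'Content-Type': 'application/json'})
print(resp.status_code, resp.json())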
Any help would be highly appreciated.

Re: Livy with DC/OS Mesos

Posted by Junaid Nasir <jn...@an10.io>.
I am able to run the Spark shell from the command line using this command:

/spark/dist/bin/pyspark --master mesos://leader.mesos:5050 --conf
spark.mesos.executor.docker.image=mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6
--conf spark.executor.home=/opt/spark/dist

I have changed Livy's configuration accordingly, i.e. set the master to leader.mesos.
When I send a POST request to /sessions with this input data:

{'kind': 'spark', 'conf': {'spark.mesos.executor.docker.image': 'mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6', 'spark.executor.home': '/opt/spark/dist'}}

it produces these errors on the Livy console:

17/10/25 13:00:00 INFO InteractiveSession$: Creating Interactive session 0: [owner: null, request: [kind: spark, proxyUser: None, conf: spark.mesos.executor.docker.image -> mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6,spark.executor.home -> /opt/spark/dist, heartbeatTimeoutInSecond: 0]]
17/10/25 13:00:00 INFO RpcServer: Connected to the port 10000
17/10/25 13:00:00 WARN RSCConf: Your hostname, agent0005.c.calm-airship-162807.internal, resolves to a loopback address, but we couldn't find any external IP address!
17/10/25 13:00:00 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/10/25 13:00:00 INFO LineBufferedStream: stdout: spark-env: User: root
17/10/25 13:00:00 INFO LineBufferedStream: stdout: Skipping bootstrap IP detection
17/10/25 13:00:00 INFO LineBufferedStream: stdout: spark-env: No kerberos KDC config found
17/10/25 13:00:00 INFO InteractiveSessionManager: Registering new session 0
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 INFO RSCDriver: Connecting to: agent0005.c.calm-airship-162807.internal:10000
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 INFO RSCDriver: Starting RPC server...
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 INFO RpcServer: Connected to the port 10001
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 WARN RSCConf: Your hostname, agent0005.c.calm-airship-162807.internal, resolves to a loopback address, but we couldn't find any external IP address!
17/10/25 13:00:01 INFO LineBufferedStream: stdout: 17/10/25 13:00:01 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/10/25 13:00:02 INFO LineBufferedStream: stdout: 17/10/25 13:00:02 INFO RSCDriver: Received job request fcbdd93f-fcd7-453d-9d0f-c303a687c7ae
17/10/25 13:00:02 INFO LineBufferedStream: stdout: 17/10/25 13:00:02 INFO RSCDriver: SparkContext not yet up, queueing job request.
17/10/25 13:00:04 WARN RSCClient: Client RPC channel closed unexpectedly.
17/10/25 13:00:04 WARN ContextLauncher: Child process exited with code 137.
17/10/25 13:00:04 WARN RSCClient: Error stopping RPC.
io.netty.util.concurrent.BlockingOperationException: DefaultChannelPromise@1831bad3(uncancellable)
        at io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:390)
        at io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
        at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:251)
        at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
        at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
        at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:218)
        at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
        at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
        at org.apache.livy.rsc.rpc.Rpc.close(Rpc.java:307)
        at org.apache.livy.rsc.RSCClient.stop(RSCClient.java:232)
        at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:129)
        at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:123)
        at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:406)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
        at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:956)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:608)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:586)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:745)
17/10/25 13:00:04 ERROR SparkProcApp: spark-submit exited with code 137
17/10/25 13:00:04 INFO RSCClient: Failing pending job fcbdd93f-fcd7-453d-9d0f-c303a687c7ae due to shutdown.
17/10/25 13:00:04 INFO InteractiveSession: Failed to ping RSC driver for session 0. Killing application.
17/10/25 13:00:04 INFO InteractiveSession: Stopping InteractiveSession 0...
17/10/25 13:00:04 INFO InteractiveSession: Stopped InteractiveSession 0.

I noticed the line "Set livy.rsc.rpc.server.address if you need to bind to
another address." I have tried setting this in conf/livy-client with the Livy
server's local IP address, but the error remains.
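
A minimal sketch of the relevant settings (the address is a placeholder, not a value
from this thread; on a default install the files are conf/livy.conf and
conf/livy-client.conf):

# conf/livy.conf -- point Livy at the Mesos master mentioned above
livy.spark.master = mesos://leader.mesos:5050

# conf/livy-client.conf -- bind the RSC RPC server to a routable address
# instead of the loopback one flagged in the warning
livy.rsc.rpc.server.address = <livy-server-ip>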
Any help would be great.
Thanks for your time






Re: Livy with DC/OS Mesos

Posted by Junaid Nasir <jn...@an10.io>.
Thanks Saisai for the reply. Can you point to anything looking at these logs?
I have been stuck on this problem for quite a while now. I think it's related to the
RPC server not being able to talk to Livy (or the other way around); I am not sure
how that works, to be honest.
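
The driver-side output can also be pulled from Livy's REST API rather than the
console when debugging this; a minimal sketch, assuming session id 0 and a
placeholder Livy host:

import requests

LIVY_URL = 'http://<livy-server>:8998'
session_id = 0  # the session created above

# Current state of the session (starting, error, dead, idle, ...)
print(requests.get(f'{LIVY_URL}/sessions/{session_id}/state').json())

# Recent driver/launcher log lines captured by Livy
log = requests.get(f'{LIVY_URL}/sessions/{session_id}/log',
                   params={'from': 0, 'size': 100}).json()
print('\n'.join(log.get('log', [])))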






Re: Livy with DC/OS Mesos

Posted by Saisai Shao <sa...@gmail.com>.
Unfortunately, the Mesos cluster manager is not currently supported by Livy; we
have only tested local and YARN modes.


Re: Livy with DC/OS Mesos

Posted by Junaid Nasir <jn...@an10.io>.
Livy version 0.4, trying to create an interactive session. Maybe it's not
supported with Mesos?




