Posted to issues@livy.apache.org by "Tim Cederquist (JIRA)" <ji...@apache.org> on 2017/09/20 21:34:00 UTC

[jira] [Comment Edited] (LIVY-405) ERROR RSCClient: Failed to connect to context

    [ https://issues.apache.org/jira/browse/LIVY-405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16173874#comment-16173874 ] 

Tim Cederquist edited comment on LIVY-405 at 9/20/17 9:33 PM:
--------------------------------------------------------------

I also get this warning, and the session fails to start up, ending with a timeout error:

17/09/20 19:55:01 INFO driver.RSCDriver: Starting RPC server...
17/09/20 19:55:01 INFO rpc.RpcServer: Connected to the port 10001
17/09/20 19:55:01 WARN rsc.RSCConf: Your hostname, my.fqdn.com, resolves to a loopback address, but we couldn't find any external IP address!
17/09/20 19:55:01 WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.

I'm running Cloudera Spark 1.6 with master=yarn-client. Everything works great with master=local, even as the impersonated user. I'm using impersonation, SPNEGO, and Kerberos-protected Spark. I've tried setting livy.rsc.rpc.server.address = x.x.x.x (my address) in livy-client.conf, but Livy does not appear to be picking this setting up. Per the documentation, I've added the classpath in livy-env.sh as export LIVY_SERVER_JAVA_OPTS="-Xmx2g -classpath /opt/livy/conf", to no effect. I'm using the 0.4-incubating release.
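The warning comes from Livy's RSCConf, which checks whether the local hostname resolves to a loopback address. As a quick sanity check outside Livy, a sketch like the following (a hypothetical diagnostic, not part of Livy) reproduces that lookup. If it prints a 127.x address for your FQDN, the usual fix is an /etc/hosts entry mapping the FQDN to the machine's real IP, or setting livy.rsc.rpc.server.address as the warning suggests:

```python
import ipaddress
import socket

def resolves_to_loopback(hostname):
    """Return (resolved_ip, is_loopback) for the given hostname."""
    ip = socket.gethostbyname(hostname)
    return ip, ipaddress.ip_address(ip).is_loopback

# Check the machine's own FQDN, the same name RSCConf complains about.
try:
    fqdn = socket.getfqdn()
    ip, loopback = resolves_to_loopback(fqdn)
    print(f"{fqdn} -> {ip} (loopback: {loopback})")
except socket.gaierror as err:
    print(f"could not resolve local FQDN: {err}")
```

If this reports a loopback address while livy.rsc.rpc.server.address is set, that would also line up with the config file not being on Livy's classpath at all.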



> ERROR RSCClient: Failed to connect to context
> ---------------------------------------------
>
>                 Key: LIVY-405
>                 URL: https://issues.apache.org/jira/browse/LIVY-405
>             Project: Livy
>          Issue Type: Question
>          Components: RSC
>    Affects Versions: 0.4
>         Environment: Ubuntu 16.04 LTS + Spark 2.1 + Hadoop 2.8 + Zeppelin 0.8
> I'm using the clustered Spark based on Hadoop 2.8.
>            Reporter: Inhwan Jung
>            Priority: Minor
>
> Hello,
> I've been trying to resolve this, but I couldn't. Please help me.
> spark@alpha001:/usr/local/livy$ pg test.py
> import json, pprint, requests, textwrap
> host = 'http://192.168.0.69:8998'
> data = {'kind': 'spark'}
> headers = {'Content-Type': 'application/json'}
> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
> pprint.pprint(r.json())
> #session_url = 'http://localhost:8998/sessions/0'
> #requests.delete(session_url, headers=headers)
> spark@alpha001:/usr/local/livy$ python test.py
> 17/09/20 07:15:00 WARN InteractiveSession$: Enable HiveContext but no hive-site.xml found under classpath or user request.
> 17/09/20 07:15:00 INFO InteractiveSession$: Creating Interactive session 0: [owner: null, request: [kind: spark, proxyUser: None, heartbeatTimeoutInSecond: 0]]
> 17/09/20 07:15:01 INFO RpcServer: Connected to the port 10000
> 17/09/20 07:15:01 WARN RSCConf: Your hostname, alpha001, resolves to a loopback address, but we couldn't find any external IP address!
> 17/09/20 07:15:01 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
> 17/09/20 07:15:01 INFO InteractiveSessionManager: Registering new session 0
> {'appId': None,
>  'appInfo': {'driverLogUrl': None, 'sparkUiUrl': None},
>  'id': 0,
>  'kind': 'spark',
>  'log': ['stdout: ', '\nstderr: '],
>  'owner': None,
>  'proxyUser': None,
>  'state': 'starting'}
> spark@alpha001:/usr/local/livy$ 17/09/20 07:15:01 INFO LineBufferedStream: stdout: Running Spark using the REST application submission protocol.
> 17/09/20 07:15:01 INFO LineBufferedStream: stdout: 17/09/20 07:15:01 WARN SparkConf: The configuration key 'spark.yarn.jar' has been deprecated as of Spark 2.0 and may be removed in the future. Please use the new key 'spark.yarn.jars' instead.
> 17/09/20 07:15:01 INFO LineBufferedStream: stdout: 17/09/20 07:15:01 INFO RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.69:6066.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO RestSubmissionClient: Submission successfully created as driver-20170920071502-0000. Polling submission state...
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20170920071502-0000 in spark://192.168.0.69:6066.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO RestSubmissionClient: State of driver driver-20170920071502-0000 is now RUNNING.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO RestSubmissionClient: Driver is running on worker worker-20170920071400-192.168.0.47-43404 at 192.168.0.47:43404.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: {
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "action" : "CreateSubmissionResponse",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "message" : "Driver successfully submitted as driver-20170920071502-0000",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "serverSparkVersion" : "2.1.1",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "submissionId" : "driver-20170920071502-0000",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "success" : true
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: }
> 17/09/20 07:16:31 ERROR RSCClient: Failed to connect to context.
> java.util.concurrent.TimeoutException: Timed out waiting for context to start.
>         at org.apache.livy.rsc.ContextLauncher.connectTimeout(ContextLauncher.java:134)
>         at org.apache.livy.rsc.ContextLauncher.access$300(ContextLauncher.java:63)
>         at org.apache.livy.rsc.ContextLauncher$2.run(ContextLauncher.java:122)
>         at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>         at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
>         at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>         at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>         at java.lang.Thread.run(Thread.java:748)
> 17/09/20 07:16:31 INFO RSCClient: Failing pending job 38e22e17-444f-4712-a587-a77c89b214c3 due to shutdown.
> 17/09/20 07:16:31 INFO InteractiveSession: Failed to ping RSC driver for session 0. Killing application.
> 17/09/20 07:16:31 INFO InteractiveSession: Stopping InteractiveSession 0...
> 17/09/20 07:16:31 INFO InteractiveSession: Stopped InteractiveSession 0.
> 17/09/20 07:16:31 WARN InteractiveSession: (Fail to get rsc uri,java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for context to start.)
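For what it's worth, the test.py above returns as soon as the POST succeeds (state 'starting'), while the actual failure ("Timed out waiting for context to start") only shows up later in the server log. A hedged sketch like this, assuming the same Livy endpoint as test.py and the session id from the POST response, polls GET /sessions/{id} so the failure becomes visible on the client side:

```python
import json
import time
from urllib.request import urlopen

# States in which a Livy session has stopped "starting"; a context-start
# timeout like the one above ends in 'error' or 'dead'.
TERMINAL_STATES = {"idle", "error", "dead", "killed", "success"}

def is_terminal(state):
    """True once the session has finished starting, successfully or not."""
    return state in TERMINAL_STATES

def wait_for_session(host, session_id, interval=5, timeout=180):
    """Poll GET /sessions/{id} until the session leaves 'starting'."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        with urlopen(f"{host}/sessions/{session_id}") as resp:
            state = json.load(resp)["state"]
        print(f"session {session_id}: {state}")
        if is_terminal(state):
            return state
        time.sleep(interval)
    raise TimeoutError(f"session {session_id} still starting after {timeout}s")

# Usage (assuming the same endpoint and session id as test.py above):
# wait_for_session("http://192.168.0.69:8998", 0)
```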



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)