Posted to user@livy.apache.org by Junaid Nasir <jn...@an10.io> on 2017/10/15 20:29:39 UTC

NoSuchFileException: spark-internal - when creating interactive session on remote spark

Hi everyone,

I am using Livy (livy-0.4.0-incubating-bin) with a remote Spark
(spark-2.2.0-bin-hadoop2.7) standalone cluster. Livy's settings are the
defaults, except for livy.spark.master = spark://10.128.1.1:6066 and
livy.spark.deploy-mode = cluster (I have also tried the default deploy mode,
but then it throws a different error). When I try to create an interactive
Spark session on the remote cluster, it creates the driver, which immediately
goes into the ERROR state.
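
For reference, the two non-default lines in conf/livy.conf described above
would look like this (values exactly as given in this thread):

    livy.spark.master = spark://10.128.1.1:6066
    livy.spark.deploy-mode = cluster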
Livy console error:

17/10/15 20:18:08 INFO LineBufferedStream: stdout: 17/10/15 20:18:08 INFO RestSubmissionClient: Submitting a request to launch an application in spark://10.128.1.1:6066.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submission successfully created as driver-20171015201808-0002. Polling submission state...
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20171015201808-0002 in spark://10.128.1.1:6066.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: State of driver driver-20171015201808-0002 is now ERROR.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Driver is running on worker worker-20171015195836-10.128.1.1-38097 at 10.128.1.1:38097.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 ERROR RestSubmissionClient: Exception from the cluster:
17/10/15 20:18:09 INFO LineBufferedStream: stdout: java.nio.file.NoSuchFileException: spark-internal
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      java.nio.file.Files.copy(Files.java:1274)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:625)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.copyFile(Utils.scala:596)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.doFetchFile(Utils.scala:681)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.fetchFile(Utils.scala:480)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:155)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:173)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:92)
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
17/10/15 20:18:09 INFO LineBufferedStream: stdout: {
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "action" : "CreateSubmissionResponse",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "message" : "Driver successfully submitted as driver-20171015201808-0002",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "serverSparkVersion" : "2.2.0",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "submissionId" : "driver-20171015201808-0002",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "success" : true
17/10/15 20:18:09 INFO LineBufferedStream: stdout: }

Can anyone please guide me on how to resolve this?
I have tried setting up Spark locally and it works fine, so I guess the
problem is with uploading jar files to the remote cluster. I am using GCE for
both the Spark cluster and the Livy server, but all ports on the internal
network are open.
The Spark worker log shows this too:
17/10/15 20:18:08 INFO Worker: Asked to launch driver driver-20171015201808-0002
17/10/15 20:18:08 INFO DriverRunner: Copying user jar spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
17/10/15 20:18:08 INFO Utils: Copying /spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
17/10/15 20:18:08 INFO DriverRunner: Killing driver process!
17/10/15 20:18:08 WARN Worker: Driver driver-20171015201808-0002 failed with unrecoverable exception: java.nio.file.NoSuchFileException: spark-internal


Re: NoSuchFileException: spark-internal - when creating interactive session on remote spark

Posted by Junaid Nasir <jn...@an10.io>.
I followed https://livy.incubator.apache.org/examples/ to set up a PySpark
session using Python; a sketch of that flow is below. Would any logs be useful
for debugging? Also, is there any plan to support standalone Spark? I will try
this with a DC/OS Spark deployment tomorrow and update this thread on whether
or not it works.
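
For context, the session-creation flow from that examples page looks roughly
like this (the Livy host below is a placeholder, not this cluster's actual
address):

    import json
    import requests

    # Livy server endpoint (placeholder host/port for illustration)
    host = 'http://livy-server:8998'
    headers = {'Content-Type': 'application/json'}

    # Ask Livy to start an interactive PySpark session
    r = requests.post(host + '/sessions',
                      data=json.dumps({'kind': 'pyspark'}),
                      headers=headers)
    session_url = host + r.headers['location']

    # Poll the new session; it should reach 'idle', but here it goes to 'error'
    print(requests.get(session_url, headers=headers).json()['state'])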





On Mon, Oct 16, 2017 at 11:16 AM, Saisai Shao <sai.sai.shao@gmail.com> wrote:
Would you please provide more information about how you create a Livy session?
As of now, Livy only officially supports Spark on YARN and local mode; we
don't test standalone cluster mode, so there may be some issues with it.
On Mon, Oct 16, 2017 at 4:29 AM, Junaid Nasir <jn...@an10.io>  wrote:
Hi everyone,

I am using Livy (livy-0.4.0-incubating-bin) with a remote Spark
(spark-2.2.0-bin-hadoop2.7) standalone cluster. Livy's settings are the
defaults, except for livy.spark.master = spark://10.128.1.1:6066 and
livy.spark.deploy-mode = cluster (I have also tried the default deploy mode,
but then it throws a different error). When I try to create an interactive
Spark session on the remote cluster, it creates the driver, which immediately
goes into the ERROR state.

Livy console error:

17/10/15 20:18:08 INFO LineBufferedStream: stdout: 17/10/15 20:18:08 INFO RestSubmissionClient: Submitting a request to launch an application in spark://10.128.1.1:6066.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submission successfully created as driver-20171015201808-0002. Polling submission state...
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20171015201808-0002 in spark://10.128.1.1:6066.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: State of driver driver-20171015201808-0002 is now ERROR.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Driver is running on worker worker-20171015195836-10.128.1.1-38097 at 10.128.1.1:38097.
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 ERROR RestSubmissionClient: Exception from the cluster:
17/10/15 20:18:09 INFO LineBufferedStream: stdout: java.nio.file.NoSuchFileException: spark-internal
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      java.nio.file.Files.copy(Files.java:1274)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:625)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.copyFile(Utils.scala:596)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.doFetchFile(Utils.scala:681)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.fetchFile(Utils.scala:480)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:155)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:173)
17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:92)
17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
17/10/15 20:18:09 INFO LineBufferedStream: stdout: {
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "action" : "CreateSubmissionResponse",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "message" : "Driver successfully submitted as driver-20171015201808-0002",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "serverSparkVersion" : "2.2.0",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "submissionId" : "driver-20171015201808-0002",
17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "success" : true
17/10/15 20:18:09 INFO LineBufferedStream: stdout: }

Can anyone please guide me on how to resolve this?

I have tried setting up Spark locally and it works fine, so I guess the
problem is with uploading jar files to the remote cluster. I am using GCE for
both the Spark cluster and the Livy server, but all ports on the internal
network are open.

The Spark worker log shows this too:

17/10/15 20:18:08 INFO Worker: Asked to launch driver driver-20171015201808-0002
17/10/15 20:18:08 INFO DriverRunner: Copying user jar spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
17/10/15 20:18:08 INFO Utils: Copying /spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
17/10/15 20:18:08 INFO DriverRunner: Killing driver process!
17/10/15 20:18:08 WARN Worker: Driver driver-20171015201808-0002 failed with unrecoverable exception: java.nio.file.NoSuchFileException: spark-internal





Re: NoSuchFileException: spark-internal - when creating interactive session on remote spark

Posted by Saisai Shao <sa...@gmail.com>.
Would you please provide more information about how you create a Livy
session? As of now, Livy only officially supports Spark on YARN and local
mode; we don't test standalone cluster mode, so there may be some issues
with it.

On Mon, Oct 16, 2017 at 4:29 AM, Junaid Nasir <jn...@an10.io> wrote:

> Hi everyone,
> I am using Livy (livy-0.4.0-incubating-bin) with a remote Spark
> (spark-2.2.0-bin-hadoop2.7) standalone cluster. Livy's settings are the
> defaults, except for livy.spark.master = spark://10.128.1.1:6066 and
> livy.spark.deploy-mode = cluster (I have also tried the default deploy mode,
> but then it throws a different error).
> When I try to create an interactive Spark session on the remote cluster, it
> creates the driver, which immediately goes into the ERROR state.
>
> Livy console error:
>
>                   17/10/15 20:18:08 INFO LineBufferedStream: stdout: 17/10/15 20:18:08 INFO RestSubmissionClient: Submitting a request to launch an application in spark://10.128.1.1:6066.
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submission successfully created as driver-20171015201808-0002. Polling submission state...
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20171015201808-0002 in spark://10.128.1.1:6066.
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: State of driver driver-20171015201808-0002 is now ERROR.
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Driver is running on worker worker-20171015195836-10.128.1.1-38097 at 10.128.1.1:38097.
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 ERROR RestSubmissionClient: Exception from the cluster:
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: java.nio.file.NoSuchFileException: spark-internal
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      java.nio.file.Files.copy(Files.java:1274)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:625)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.copyFile(Utils.scala:596)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.doFetchFile(Utils.scala:681)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.util.Utils$.fetchFile(Utils.scala:480)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:155)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:173)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:      org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:92)
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: 17/10/15 20:18:09 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: {
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "action" : "CreateSubmissionResponse",
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "message" : "Driver successfully submitted as driver-20171015201808-0002",
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "serverSparkVersion" : "2.2.0",
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "submissionId" : "driver-20171015201808-0002",
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout:   "success" : true
> 17/10/15 20:18:09 INFO LineBufferedStream: stdout: }
>
>
> Can anyone please guide me on how to resolve this?
>
> I have tried setting up Spark locally and it works fine, so I guess the
> problem is with uploading jar files to the remote cluster. I am using GCE for
> both the Spark cluster and the Livy server, but all ports on the internal
> network are open.
>
> The Spark worker log shows this too:
>
>                   17/10/15 20:18:08 INFO Worker: Asked to launch driver driver-20171015201808-0002
> 17/10/15 20:18:08 INFO DriverRunner: Copying user jar spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
> 17/10/15 20:18:08 INFO Utils: Copying /spark-internal to /home/junaid.nasir/spark-2.2.0-bin-hadoop2.7/work/driver-20171015201808-0002/spark-internal
> 17/10/15 20:18:08 INFO DriverRunner: Killing driver process!
> 17/10/15 20:18:08 WARN Worker: Driver driver-20171015201808-0002 failed with unrecoverable exception: java.nio.file.NoSuchFileException: spark-internal