Posted to issues@spark.apache.org by "Srujan A (Jira)" <ji...@apache.org> on 2019/11/08 22:57:00 UTC

[jira] [Updated] (SPARK-29804) Spark-shell is failing on YARN mode

     [ https://issues.apache.org/jira/browse/SPARK-29804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Srujan A updated SPARK-29804:
-----------------------------
    Docs Text: 
[root@bluedata-164 hadoop]# /usr/lib/spark/spark-2.4.4/bin/spark-shell
[main] INFO org.apache.spark.SecurityManager - Changing view acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing modify acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
[main] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
[main] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
[main] INFO org.apache.spark.util.SignalUtils - Registered signal handler for INT
[main] INFO org.apache.spark.SparkContext - Running Spark version 2.4.4
[main] INFO org.apache.spark.SparkContext - Submitted application: Spark shell
[main] INFO org.apache.spark.SecurityManager - Changing view acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing modify acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
[main] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
[main] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
[main] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 44125.
[main] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
[main] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
[main] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
[main] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - BlockManagerMasterEndpoint up
[main] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /tmp/blockmgr-ac4d034f-50a3-494b-818a-54f98b1c6e16
[main] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore started with capacity 93.3 MB
[main] INFO org.apache.spark.SparkEnv - Registering OutputCommitCoordinator
[main] INFO org.spark_project.jetty.util.log - Logging initialized @7542ms
[main] INFO org.spark_project.jetty.server.Server - jetty-9.3.z-SNAPSHOT, build timestamp: 2018-06-05T10:11:56-07:00, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827
[main] INFO org.spark_project.jetty.server.Server - Started @7628ms
[main] INFO org.spark_project.jetty.server.AbstractConnector - Started ServerConnector@6f740044{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
[main] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4040.
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@72e1e587{/jobs,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@f833223{/jobs/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@5186b78a{/jobs/job,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1e749235{/jobs/job/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@870a9f2{/stages,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6c2e7591{/stages/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7f2542f{/stages/stage,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@2f82e3cf{/stages/stage/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3a44993c{/stages/pool,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@6b6606d1{/stages/pool/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@27605b87{/storage,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1fc4483f{/storage/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1d7af82{/storage/rdd,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@5adc71e7{/storage/rdd/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7fc5a558{/environment,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@42fd8f2f{/environment/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@7eea934d{/executors,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@4700963e{/executors/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@153d14e3{/executors/threadDump,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@3b17759c{/executors/threadDump/json,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@11cdf948{/static,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1b5d1d9{/,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@703a2bc9{/api,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@d17d554{/jobs/job/kill,null,AVAILABLE,@Spark}
[main] INFO org.spark_project.jetty.server.handler.ContextHandler - Started o.s.j.s.ServletContextHandler@1f1c7fde{/stages/stage/kill,null,AVAILABLE,@Spark}
[main] INFO org.apache.spark.ui.SparkUI - Bound SparkUI to 0.0.0.0, and started at http://bluedata-164.appdev.bdlocal:4040
[main] INFO org.apache.spark.SparkContext - Added JAR file:/opt/bluedata/bluedata-dtap.jar at spark://bluedata-164.appdev.bdlocal:44125/jars/bluedata-dtap.jar with timestamp 1573252886754
[main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at /0.0.0.0:8032
[main] INFO org.apache.spark.deploy.yarn.Client - Requesting a new application from cluster with 1 NodeManagers
[main] INFO org.apache.hadoop.conf.Configuration - resource-types.xml not found
[main] INFO org.apache.hadoop.yarn.util.resource.ResourceUtils - Unable to find 'resource-types.xml'.
[main] INFO org.apache.spark.deploy.yarn.Client - Verifying our application has not requested more than the maximum memory capability of the cluster (1536 MB per container)
[main] INFO org.apache.spark.deploy.yarn.Client - Will allocate AM container, with 896 MB memory including 384 MB overhead
[main] INFO org.apache.spark.deploy.yarn.Client - Setting up container launch context for our AM
[main] INFO org.apache.spark.deploy.yarn.Client - Setting up the launch environment for our AM container
[main] INFO org.apache.spark.deploy.yarn.Client - Preparing resources for our AM container
[main] WARN org.apache.spark.deploy.yarn.Client - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
[main] INFO org.apache.spark.deploy.yarn.Client - Uploading resource file:/tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641/__spark_libs__2727717899746864854.zip -> file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip
[main] INFO org.apache.spark.deploy.yarn.Client - Uploading resource file:/opt/bluedata/bluedata-dtap.jar -> file:/root/.sparkStaging/application_1573252848154_0001/bluedata-dtap.jar
[main] INFO org.apache.spark.deploy.yarn.Client - Uploading resource file:/tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641/__spark_conf__2009935022437236524.zip -> file:/root/.sparkStaging/application_1573252848154_0001/__spark_conf__.zip
[main] INFO org.apache.spark.SecurityManager - Changing view acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing modify acls to: root
[main] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
[main] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
[main] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
[main] INFO org.apache.spark.deploy.yarn.Client - Submitting application application_1573252848154_0001 to ResourceManager
[main] INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1573252848154_0001
[main] INFO org.apache.spark.scheduler.cluster.SchedulerExtensionServices - Starting Yarn extension services with app application_1573252848154_0001 and attemptId None
[main] INFO org.apache.spark.deploy.yarn.Client - Application report for application_1573252848154_0001 (state: ACCEPTED)
[main] INFO org.apache.spark.deploy.yarn.Client -
         client token: N/A
         diagnostics: [Fri Nov 08 14:41:30 -0800 2019] Scheduler has assigned a container for AM, waiting for AM container to be launched
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1573252889669
         final status: UNDEFINED
         tracking URL: http://bluedata-164.appdev.bdlocal:8088/proxy/application_1573252848154_0001/
         user: root
[main] INFO org.apache.spark.deploy.yarn.Client - Application report for application_1573252848154_0001 (state: ACCEPTED)
[main] INFO org.apache.spark.deploy.yarn.Client - Application report for application_1573252848154_0001 (state: FAILED)
[main] INFO org.apache.spark.deploy.yarn.Client -
         client token: N/A
         diagnostics: Application application_1573252848154_0001 failed 2 times due to AM Container for appattempt_1573252848154_0001_000002 exited with  exitCode: -1000
Failing this attempt.Diagnostics: [2019-11-08 14:41:32.802]File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
java.io.FileNotFoundException: File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:454)
        at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

For more detailed output, check the application tracking page: http://bluedata-164.appdev.bdlocal:8088/cluster/app/application_1573252848154_0001 Then click on links to logs of each attempt.
. Failing the application.
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1573252889669
         final status: FAILED
         tracking URL: http://bluedata-164.appdev.bdlocal:8088/cluster/app/application_1573252848154_0001
         user: root
[main] INFO org.apache.spark.deploy.yarn.Client - Deleted staging directory file:/root/.sparkStaging/application_1573252848154_0001
[main] ERROR org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend - The YARN application has already ended! It might have been killed or the Application Master may have failed to start. Check the YARN application logs for more details.
[main] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.spark.SparkException: Application application_1573252848154_0001 failed 2 times due to AM Container for appattempt_1573252848154_0001_000002 exited with  exitCode: -1000
Failing this attempt.Diagnostics: [2019-11-08 14:41:32.802]File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
java.io.FileNotFoundException: File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:454)
        at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

For more detailed output, check the application tracking page: http://bluedata-164.appdev.bdlocal:8088/cluster/app/application_1573252848154_0001 Then click on links to logs of each attempt.
. Failing the application.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:112)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[main] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@6f740044{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
[main] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://bluedata-164.appdev.bdlocal:4040
[dispatcher-event-loop-6] WARN org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint - Attempted to request executors before the AM has registered!
[main] INFO org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend - Shutting down all executors
[dispatcher-event-loop-7] INFO org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnDriverEndpoint - Asking each executor to shut down
[main] INFO org.apache.spark.scheduler.cluster.SchedulerExtensionServices - Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
[main] INFO org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend - Stopped
[dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[main] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[main] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[main] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[main] WARN org.apache.spark.metrics.MetricsSystem - Stopping a MetricsSystem that is not running
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[main] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
[main] ERROR org.apache.spark.repl.Main - Failed to initialize Spark session.
org.apache.spark.SparkException: Application application_1573252848154_0001 failed 2 times due to AM Container for appattempt_1573252848154_0001_000002 exited with  exitCode: -1000
Failing this attempt.Diagnostics: [2019-11-08 14:41:32.802]File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
java.io.FileNotFoundException: File file:/root/.sparkStaging/application_1573252848154_0001/__spark_libs__2727717899746864854.zip does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:641)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:930)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:631)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:454)
        at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:269)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:67)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:414)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:411)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:411)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:242)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

For more detailed output, check the application tracking page: http://bluedata-164.appdev.bdlocal:8088/cluster/app/application_1573252848154_0001 Then click on links to logs of each attempt.
. Failing the application.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:112)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-1e146ac5-66bc-4c65-b319-f5e20f15d4fc
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641/repl-3fb31706-552c-41f1-857e-76058f8e166c
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641
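
Note on the log above: the WARN "Neither spark.yarn.jars nor spark.yarn.archive is set" together with the file: scheme on all three staging uploads (file:/root/.sparkStaging/...) suggests the Hadoop client's default filesystem resolves to the local disk, so the driver stages __spark_libs__*.zip under /root on its own machine and the NodeManager's localizer then cannot find or read that path, which matches the exitCode -1000 and FileNotFoundException. The following is a minimal sketch of one way to confirm and work around this, assuming a reachable HDFS service; the /spark/jars path and spark-libs.zip name are placeholders, not taken from this environment:

# Confirm what the client treats as the default filesystem; on a working
# YARN cluster this is normally an hdfs:// URI, not file:///
hdfs getconf -confKey fs.defaultFS

# Workaround sketch: stage the Spark jars on HDFS once and reference them via
# spark.yarn.archive, so AM localization no longer depends on /root on the driver.
# (Use a full hdfs://host:port URI below if fs.defaultFS is misconfigured.)
cd /usr/lib/spark/spark-2.4.4/jars
zip -q /tmp/spark-libs.zip *.jar          # the jars must sit at the archive root
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put /tmp/spark-libs.zip /spark/jars/
echo "spark.yarn.archive hdfs:///spark/jars/spark-libs.zip" \
    >> /usr/lib/spark/spark-2.4.4/conf/spark-defaults.conf

If fs.defaultFS really does come back as file:///, correcting it in core-site.xml to the cluster's hdfs:// URI is the more complete repair, since the staging directory (derived from the filesystem's home directory, hence file:/root/.sparkStaging here) and the other two uploads (__spark_conf__.zip, bluedata-dtap.jar) would then land on HDFS, where every NodeManager can localize them.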


        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:235)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:223)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

For more detailed output, check the application tracking page: http://bluedata-164.appdev.bdlocal:8088/cluster/app/application_1573252848154_0001 Then click on links to logs of each attempt.
. Failing the application.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:112)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
        at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-1e146ac5-66bc-4c65-b319-f5e20f15d4fc
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641/repl-3fb31706-552c-41f1-857e-76058f8e166c
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cdeaaf05-639c-41c0-8be9-dc2cadc30641
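
A note on the failure above: exitCode -1000 is YARN's resource-localization error, and the path the NodeManager cannot find is a file:/ URI under /root/.sparkStaging. In other words, the staging directory resolved to the local filesystem of the submitting host, so the __spark_libs__ archive was never uploaded anywhere the AM container's host can read it. A minimal sketch of a workaround, assuming an HDFS namenode is reachable; the config dir, host, and port below are placeholders, not values taken from this cluster:

    # Point spark-shell at the cluster's Hadoop config and force the YARN
    # staging dir onto HDFS (spark.yarn.stagingDir is a standard Spark-on-YARN
    # property; <namenode> is a placeholder).
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    spark-shell --master yarn \
        --conf spark.yarn.stagingDir=hdfs://<namenode>:8020/user/root

Overriding spark.yarn.stagingDir only moves the upload; if fs.defaultFS itself names the local filesystem, fixing core-site.xml is the real repair.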



> Spark-shell is failing on YARN mode
> -----------------------------------
>
>                 Key: SPARK-29804
>                 URL: https://issues.apache.org/jira/browse/SPARK-29804
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.4.4
>         Environment: Spark2.4.4, Apache Hadoop 3.1.2
>            Reporter: Srujan A
>            Priority: Blocker
>             Fix For: 2.4.4
>
>
> I am trying to run spark-shell in YARN mode from containers, and it is failing for the reason below. Please help me out.
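
Given the description, the first thing worth confirming inside the container is what fs.defaultFS actually resolves to for the user running spark-shell. A hedged check; the expected value shown is a placeholder, not this cluster's address:

    # If this prints file:/// (or nothing), the client is not seeing an HDFS
    # core-site.xml, which would explain .sparkStaging landing on local disk.
    hdfs getconf -confKey fs.defaultFS    # expect e.g. hdfs://<namenode>:8020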


