Posted to user@spark.apache.org by mvle <mv...@us.ibm.com> on 2015/10/05 17:41:21 UTC

Spark on YARN using Java 1.8 fails

Hi,

I have successfully run pyspark on Spark 1.5.1 on YARN 2.7.1 with Java
OpenJDK 1.7.
However, when I run the same test on Java OpenJDK 1.8 (or Oracle Java 1.8),
I cannot start up pyspark.
Has anyone been able to run Spark on YARN with Java 1.8?

I get ApplicationMaster disassociated messages...

15/10/05 09:55:05 INFO cluster.YarnClientSchedulerBackend: Application
application_1444055784612_0003 has started running.
15/10/05 09:55:05 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 53518.
15/10/05 09:55:05 INFO netty.NettyBlockTransferService: Server created on
53518
15/10/05 09:55:05 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/10/05 09:55:05 INFO storage.BlockManagerMasterEndpoint: Registering block
manager xxx.172.232.xx:53518 with 530.0 MB RAM, BlockManagerId(driver,
xxx.172.232.xx, 53518)
15/10/05 09:55:05 INFO storage.BlockManagerMaster: Registered BlockManager
15/10/05 09:55:05 INFO scheduler.EventLoggingListener: Logging events to
hdfs://h-all1-001:9000/user/hadoop/sparklogs/application_1444055784612_0003
15/10/05 09:55:07 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster has disassociated: xxx.172.232.xx:42628
15/10/05 09:55:07 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkYarnAM@xxx.172.232.xx:42628] has failed,
address is now gated for [5000] ms. Reason: [Disassociated]
15/10/05 09:55:07 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster has disassociated: xxx.172.232.xx:42628
15/10/05 09:55:09 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster registered as
AkkaRpcEndpointRef(Actor[akka.tcp://sparkYarnAM@xxx.172.232.xx:60077/user/YarnAM#-560267402])
15/10/05 09:55:09 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter.
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS
-> h-all1-001, PROXY_URI_BASES ->
http://h-all1-001:8088/proxy/application_1444055784612_0003),
/proxy/application_1444055784612_0003
15/10/05 09:55:09 INFO ui.JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/10/05 09:55:13 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster has disassociated: xxx.172.232.xx:60077
15/10/05 09:55:13 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster has disassociated: xxx.172.232.xx:60077
15/10/05 09:55:13 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkYarnAM@xxx.172.232.xx:60077] has failed,
address is now gated for [5000] ms. Reason: [Disassociated]
15/10/05 09:55:13 ERROR cluster.YarnClientSchedulerBackend: Yarn application
has already exited with state FINISHED!
15/10/05 09:55:13 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/metrics/json,null}
15/10/05 09:55:13 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/10/05 09:55:13 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-YARN-using-Java-1-8-fails-tp24925.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Spark on YARN using Java 1.8 fails

Posted by Ted Yu <yu...@gmail.com>.
YARN 2.7.1 (running on the cluster) was built with Java 1.8, I assume.

Have you used the following command to retrieve/inspect the logs?
yarn logs -applicationId <application ID>
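For example, using the application ID from the output above (this assumes log aggregation is enabled on the cluster, so that the containers' logs are available after the application exits):

```
# Fetch the aggregated container logs for the failed application;
# the ApplicationMaster's stderr usually shows why the JVM exited.
yarn logs -applicationId application_1444055784612_0003 > app.log
grep -i -B2 -A5 "error\|exception" app.log
```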

Cheers


Re: Spark on YARN using Java 1.8 fails

Posted by Abel Rincón <ga...@gmail.com>.
Hi,

There was another related question:

https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201506.mbox/%3CCAJ2peNeruM2Y2Tbf8-Wiras-weE586LM_o25FsN=+Z1-BFWsnw@mail.gmail.com%3E


Some months ago, if I remember correctly, we hit the same problem using Spark
1.3 + YARN + Java 8:
https://issues.apache.org/jira/browse/SPARK-6388

BTW, nowadays we chose to use Java 7.
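If you do retry on Java 8: one commonly reported cause of the AM exiting with no error in the driver log is YARN killing the container for exceeding its memory limit, since Java 8's Metaspace sits off-heap and can use more native memory than Java 7's PermGen did. A sketch of a possible workaround (the overhead values are guesses, not a confirmed fix for this thread) is to raise the YARN memory overhead:

```
# Untested sketch: give the AM and executors extra off-heap headroom (in MB)
# so YARN's memory check does not kill the containers under Java 8.
pyspark --master yarn-client \
  --conf spark.yarn.am.memoryOverhead=512 \
  --conf spark.yarn.executor.memoryOverhead=1024
```

The `yarn logs` output suggested earlier in the thread should confirm whether this is the cause: YARN logs a "running beyond physical memory limits" message when it kills a container for this reason.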

Re: Spark on YARN using Java 1.8 fails

Posted by mvle <mv...@us.ibm.com>.
Unfortunately, no. I switched back to OpenJDK 1.7.
Didn't get a chance to dig deeper.





Re: Spark on YARN using Java 1.8 fails

Posted by Abhisheks <sm...@gmail.com>.
Did you get any resolution for this?


