Posted to user@ignite.apache.org by "Premachandran, Mahesh (Nokia - IN/Bangalore)" <ma...@nokia.com> on 2018/06/01 06:06:50 UTC

Spark + Ignite standalone mode on Kubernetes cluster.

Hello,

I have been trying to use Spark with Ignite (standalone mode) on a Kubernetes cluster and am running into some issues. Some details about the setup:

K8s Version: 1.9.7
Spark Version: 2.3.0
Ignite Version: 2.4.0
RBAC disabled.

I have been using the example provided here<https://github.com/apache/ignite/blob/master/examples/src/main/spark/org/apache/ignite/examples/spark/SharedRDDExample.java>; the only difference is that client mode is set to true. The stack trace is:
2018-06-01 05:14:53 ERROR TcpDiscoverySpi:310 - Failed to get registered addresses from IP finder on start (retrying every 2000ms; change 'reconnectDelay' to configure the frequency of retries).
class org.apache.ignite.spi.IgniteSpiException: Failed to retrieve Ignite pods IP addresses.
        at org.apache.ignite.spi.discovery.tcp.ipfinder.kubernetes.TcpDiscoveryKubernetesIpFinder.getRegisteredAddresses(TcpDiscoveryKubernetesIpFinder.java:172)
        at org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi.registeredAddresses(TcpDiscoverySpi.java:1810)
        at org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi.resolvedAddresses(TcpDiscoverySpi.java:1758)
        at org.apache.ignite.spi.discovery.tcp.ClientImpl.joinTopology(ClientImpl.java:494)
        at org.apache.ignite.spi.discovery.tcp.ClientImpl.access$900(ClientImpl.java:125)
        at org.apache.ignite.spi.discovery.tcp.ClientImpl$MessageWorker.tryJoin(ClientImpl.java:1837)
        at org.apache.ignite.spi.discovery.tcp.ClientImpl$MessageWorker.body(ClientImpl.java:1550)
        at org.apache.ignite.spi.IgniteSpiThread.run(IgniteSpiThread.java:62)
Caused by: java.io.FileNotFoundException: https://kubernetes.default.svc.cluster.local:443/api/v1/namespaces/default/endpoints/ignite
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1890)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
        at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:263)
        at org.apache.ignite.spi.discovery.tcp.ipfinder.kubernetes.TcpDiscoveryKubernetesIpFinder.getRegisteredAddresses(TcpDiscoveryKubernetesIpFinder.java:153)
        ... 7 more

I have explicitly set the masterUrl for the Kubernetes API server in the config XML:

<bean class="org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi">
      <property name="ipFinder">
        <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.kubernetes.TcpDiscoveryKubernetesIpFinder">
         <property name="namespace" value="default"/>
        <property name="serviceName" value="ignite-spark"/>
        <property name="masterUrl" value="https://k8s-apiserver.bcmt.cluster.local:8443"/>

        </bean>
      </property>
    </bean>

but for some reason it is defaulting to the address https://kubernetes.default.svc.cluster.local:443/api/v1/namespaces/default/endpoints/ignite. I have also attached the Spark pod logs. I found a similar issue on the mailing list<http://apache-ignite-users.70518.x6.nabble.com/Unable-to-connect-ignite-pods-in-Kubernetes-using-Ip-finder-td18009.html>, but RBAC is disabled on our k8s cluster.
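
Note that the failing URL uses both the default master URL and the default endpoint name "ignite" instead of "ignite-spark", which makes me suspect the client node started inside the Spark driver is not reading this XML at all. As a sanity check, here is a rough, untested sketch of the same discovery settings built programmatically (values copied from the XML above, class and setter names as they appear in Ignite 2.4), which could be run on the driver to confirm the values actually reach the IP finder:

// Rough sketch only: same discovery settings as the XML above, built in code,
// so they cannot be silently lost if the XML file is not found at runtime.
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import org.apache.ignite.spi.discovery.tcp.ipfinder.kubernetes.TcpDiscoveryKubernetesIpFinder;

public class IgniteK8sClientCheck {
    public static void main(String[] args) {
        TcpDiscoveryKubernetesIpFinder ipFinder = new TcpDiscoveryKubernetesIpFinder();
        ipFinder.setNamespace("default");
        ipFinder.setServiceName("ignite-spark");
        ipFinder.setMasterUrl("https://k8s-apiserver.bcmt.cluster.local:8443");

        TcpDiscoverySpi discoverySpi = new TcpDiscoverySpi();
        discoverySpi.setIpFinder(ipFinder);

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setClientMode(true); // same "client mode = true" change made to the example
        cfg.setDiscoverySpi(discoverySpi);

        // Joins as a client, prints the topology size, then shuts down.
        try (Ignite ignite = Ignition.start(cfg)) {
            System.out.println("Joined topology, nodes: " + ignite.cluster().nodes().size());
        }
    }
}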

The spark-submit command I used is as follows:

bin/spark-submit     --master k8s://https://k8s-apiserver.bcmt.cluster.local:8443     --deploy-mode cluster     --name JavaIgniteRDDExample     --class com.nokia.SharedRDDExample    --conf spark.executor.instances=2     --conf spark.kubernetes.container.image=bcmt-registry:5000/spark:2.3 --jars <path/url to jars>  <application_path/URL>
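
For reference, the SharedRDD example creates its Ignite client through JavaIgniteContext with a path to the Spring XML. In --deploy-mode cluster that path is resolved inside the driver pod, not on the machine running spark-submit, so the modified XML has to be present in the container image (or otherwise shipped to the driver); if a different or stale copy of the file is picked up there, the client can end up with the default IP finder settings seen in the error above. A minimal sketch of that wiring, using a hypothetical in-image path /opt/ignite/config/ignite-k8s.xml and the example's cache name:

// Sketch only: mirrors how SharedRDDExample wires Spark to Ignite; the config path
// below is hypothetical and must exist inside the Spark driver/executor containers.
import org.apache.ignite.spark.JavaIgniteContext;
import org.apache.ignite.spark.JavaIgniteRDD;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SharedRDDOnK8s {
    public static void main(String[] args) {
        JavaSparkContext sparkContext =
            new JavaSparkContext(new SparkConf().setAppName("JavaIgniteRDDExample"));

        // Third argument kept the same as in the original SharedRDDExample.
        JavaIgniteContext<Integer, Integer> igniteContext =
            new JavaIgniteContext<>(sparkContext, "/opt/ignite/config/ignite-k8s.xml", false);

        // "sharedRDD" is the cache name used by the example configuration.
        JavaIgniteRDD<Integer, Integer> sharedRDD = igniteContext.fromCache("sharedRDD");
        System.out.println("Cache size: " + sharedRDD.count());

        sparkContext.stop();
    }
}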






Re: Spark + Ignite standalone mode on Kubernetes cluster.

Posted by Andrey Mashenkov <an...@gmail.com>.
Hi,

It looks like a linkage error.
Please check your dependency versions, e.g. the Jackson or Scala version.
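
One quick way to check (a rough sketch, not from the original run): print which jar each of the suspect classes is actually loaded from on the driver. A Scala 2.10 / 2.11 mix, or two different jackson-module-scala builds, usually shows up immediately:

// Classpath diagnostic sketch: report the jar each class comes from.
import java.security.CodeSource;

public class LinkageCheck {
    public static void main(String[] args) {
        report("scala.util.matching.Regex");
        report("com.fasterxml.jackson.module.scala.DefaultScalaModule");
        report("com.fasterxml.jackson.databind.ObjectMapper");
    }

    private static void report(String className) {
        try {
            Class<?> cls = Class.forName(className);
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            System.out.println(className + " -> "
                + (src == null ? "bootstrap/unknown" : src.getLocation()));
        } catch (ClassNotFoundException e) {
            System.out.println(className + " -> not on classpath");
        }
    }
}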

On Wed, Jun 6, 2018 at 6:47 AM, Premachandran, Mahesh (Nokia -
IN/Bangalore) <ma...@nokia.com> wrote:

> We are trying to integrate Spark 2.2.0 with Ignite 2.4 on Kubernetes, and are getting the following exception.
>
> 2018-06-06 03:23:51 INFO  GridDiscoveryManager:495 - Topology snapshot [ver=3, servers=2, clients=1, CPUs=12, offheap=3.1GB, heap=3.0GB]
> Exception in thread "main" java.lang.NoSuchMethodError: scala.util.matching.Regex.unapplySeq(Ljava/lang/CharSequence;)Lscala/Option;
>         at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:30)
>         at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
>         at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
>         at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
>         at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
>         at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
>         at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
>         at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
>         at org.apache.spark.SparkContext.parallelize(SparkContext.scala:718)
>         at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:134)
>         at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:146)
>         at org.apache.ignite.examples.spark.SharedRDDExample.main(SharedRDDExample.java:81)
>
> Two Spark executors are connecting to Ignite in standalone mode. We are using Ignite's SharedRDD example (https://github.com/apache/ignite/blob/2.4.0/examples/src/main/spark/org/apache/ignite/examples/spark/SharedRDDExample.java)
>
> The example jar and Ignite were built with Scala version 2.11. The Ignite XML is attached.
>
>
>
> -----Original Message-----
> From: Ray [mailto:rayliu@cisco.com]
> Sent: Friday, June 01, 2018 12:00 PM
> To: user@ignite.apache.org
> Subject: Re: Spark + Ignite standalone mode on Kubernetes cluster.
>
> Currently, Ignite supports Spark up to version 2.2.0.
> Please try with Spark 2.2.0 or wait until this ticket is resolved.
> https://issues.apache.org/jira/browse/IGNITE-8534
>
>
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>



-- 
Best regards,
Andrey V. Mashenkov

RE: Spark + Ignite standalone mode on Kubernetes cluster.

Posted by "Premachandran, Mahesh (Nokia - IN/Bangalore)" <ma...@nokia.com>.
We are trying to integrate Spark 2.2.0 with Ignite 2.4 on Kubernetes, and are getting the following exception.

2018-06-06 03:23:51 INFO  GridDiscoveryManager:495 - Topology snapshot [ver=3, servers=2, clients=1, CPUs=12, offheap=3.1GB, heap=3.0GB]
Exception in thread "main" java.lang.NoSuchMethodError: scala.util.matching.Regex.unapplySeq(Ljava/lang/CharSequence;)Lscala/Option;
        at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:30)
        at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
        at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
        at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
        at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
        at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
        at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
        at org.apache.spark.SparkContext.parallelize(SparkContext.scala:718)
        at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:134)
        at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:146)
        at org.apache.ignite.examples.spark.SharedRDDExample.main(SharedRDDExample.java:81)

Two Spark executors are connecting to Ignite in standalone mode. We are using Ignite's SharedRDD example (https://github.com/apache/ignite/blob/2.4.0/examples/src/main/spark/org/apache/ignite/examples/spark/SharedRDDExample.java)

The example jar and Ignite were built with Scala version 2.11. The Ignite XML is attached.
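
For context, the NoSuchMethodError is thrown the first time Spark's RDDOperationScope registers Jackson's DefaultScalaModule, i.e. on the first parallelize() call, which is why the Ignite topology snapshot above still prints successfully before the failure. A minimal smoke test of that same path (a sketch only, run in local mode with a made-up app name) would be expected to fail the same way if the Scala versions on the classpath are inconsistent:

// Sketch: exercises the same Spark code path that fails at SharedRDDExample.java:81.
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ParallelizeSmokeTest {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("parallelize-smoke-test").setMaster("local[1]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // The first parallelize() initializes RDDOperationScope and its Jackson Scala module.
            long n = sc.parallelize(Arrays.asList(1, 2, 3)).count();
            System.out.println("parallelize OK, count = " + n);
        }
    }
}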



-----Original Message-----
From: Ray [mailto:rayliu@cisco.com] 
Sent: Friday, June 01, 2018 12:00 PM
To: user@ignite.apache.org
Subject: Re: Spark + Ignite standalone mode on Kubernetes cluster.

Currently, Ignite supports Spark up to version 2.2.0.
Please try with Spark 2.2.0 or wait until this ticket is resolved.
https://issues.apache.org/jira/browse/IGNITE-8534





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/

Re: Spark + Ignite standalone mode on Kubernetes cluster.

Posted by Ray <ra...@cisco.com>.
Currently, Ignite supports Spark up to version 2.2.0.
Please try with Spark 2.2.0 or wait until this ticket is resolved.
https://issues.apache.org/jira/browse/IGNITE-8534





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/