Posted to user@ignite.apache.org by Kristian Rosenvold <kr...@apache.org> on 2016/10/27 11:21:13 UTC

Cluster breaks down on network failure

We have 2 nodes in each of 2 datacenters, and all four nodes are supposed to
be replicated. Our IT people do regular failover tests in which they simulate
a complete loss of communication between the two data centers.

Ignite consistently fails this test. It also fails to reconnect the cluster
in any way, so we end up with a permanent split brain.

What can we do to make this work better?

Kristian

Re: Cluster breaks down on network failure

Posted by luqmanahmad <lu...@gmail.com>.
See [1] for a free network segmentation plugin.

[1]  https://github.com/luqmanahmad/ignite-plugins
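
If you would rather roll something yourself, plain Ignite exposes a
SegmentationResolver hook (IgniteConfiguration#setSegmentationResolvers),
which is presumably what a plugin like [1] builds on. Below is only a rough
sketch of the idea, not the linked plugin's actual code; the quorum host and
port are hypothetical placeholders for an address that is reachable only from
the side of the network that should stay alive:

// Rough sketch, assuming the open-source SegmentationResolver interface.
// quorum.example.org:443 is a hypothetical placeholder.
import java.io.Serializable;
import java.net.InetSocketAddress;
import java.net.Socket;

import org.apache.ignite.plugin.segmentation.SegmentationResolver;

public class ReachabilitySegmentationResolver
    implements SegmentationResolver, Serializable {
    private static final String QUORUM_HOST = "quorum.example.org"; // placeholder
    private static final int QUORUM_PORT = 443;                     // placeholder
    private static final int TIMEOUT_MS = 2_000;

    /** Local segment is considered valid only while the quorum host is reachable. */
    @Override public boolean isValidSegment() {
        try (Socket sock = new Socket()) {
            sock.connect(new InetSocketAddress(QUORUM_HOST, QUORUM_PORT), TIMEOUT_MS);
            return true;
        }
        catch (Exception e) {
            return false;
        }
    }
}

You would register it with cfg.setSegmentationResolvers(new
ReachabilitySegmentationResolver()). Note that, as far as I know, stock
Ignite only exposes this configuration; the machinery that actually invokes
the resolvers is what a plugin like [1] (or GridGain) supplies.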




Re: Cluster breaks down on network failure

Posted by vkulichenko <va...@gmail.com>.
Hi Kristian,

Once a node gets segmented, it can't be reconnected to the cluster, because
Ignite doesn't merge data after a split brain. This is done to guarantee data
consistency.

If segmentation happens, you should stop the segmented nodes and start them
again. Note that in your scenario this means restarting the full cluster,
because if the connection between the data centers is lost, all nodes will be
segmented. Generally, in this case you want to define which part of the
cluster keeps running and which part is restarted. This is not available in
Ignite, but GridGain provides it as part of their paid solution [1]. They
also offer asynchronous data center replication, which probably fits your use
case better [2].

[1] https://gridgain.readme.io/docs/network-segmentation
[2] https://gridgain.readme.io/docs/data-center-replication
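
If it helps, the "stop the segmented nodes and start them again" part can be
automated by listening for the local segmentation event. A minimal sketch,
assuming the node is launched via ignite.sh so that Ignition.restart() can
actually bring it back up:

// Minimal sketch: restart the local node when it detects it has been segmented.
// Assumes the node was launched via ignite.sh/ignite.bat so that
// Ignition.restart() is able to bring the JVM back up.
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.events.Event;
import org.apache.ignite.events.EventType;

public class RestartOnSegmentation {
    public static void main(String[] args) {
        IgniteConfiguration cfg = new IgniteConfiguration();

        // Segmentation events are not recorded unless explicitly enabled.
        cfg.setIncludeEventTypes(EventType.EVT_NODE_SEGMENTED);

        Ignite ignite = Ignition.start(cfg);

        ignite.events().localListen((Event evt) -> {
            // The local node is segmented from the rest of the cluster;
            // restart it so it rejoins once the network is back.
            Ignition.restart(true);
            return false; // node is going down, stop listening
        }, EventType.EVT_NODE_SEGMENTED);
    }
}

If I remember right, cfg.setSegmentationPolicy(SegmentationPolicy.RESTART_JVM)
achieves roughly the same thing without a listener. In your case, though,
both halves will see themselves as segmented, so as noted above this still
amounts to restarting the whole cluster.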

-Val


