Posted to user@ignite.apache.org by mehdi sey <se...@gmail.com> on 2019/01/05 08:56:09 UTC

Error when running a shared RDD in Ignite

Hi, I have some code that writes into an Ignite RDD. The program reads data from
a Spark RDD and caches it in a shared Ignite RDD. I run it from the command line
on Ubuntu Linux, but in the middle of execution I hit the error below. I checked
the Spark UI to see whether the job completed; it did not complete and failed.
Why? I have attached the code I wrote and the command I ran it with.

This is my Scala code:
package com.gridgain

import org.apache.ignite.spark.{IgniteContext, IgniteRDD}
import org.apache.spark.{SparkConf, SparkContext}

// Writes the pairs (1,1)..(1000,1000) from a Spark RDD into the Ignite cache "sharedRDD".
object RDDWriter extends App {
  val conf = new SparkConf().setAppName("RDDWriter")
  val sc = new SparkContext(conf)
  val ic = new IgniteContext(sc,
    "/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml")
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("sharedRDD")
  sharedRDD.savePairs(sc.parallelize(1 to 1000, 10).map(i => (i, i)))
  ic.close(true)
  sc.stop()
}

// Reads the same cache back and counts the values greater than 500.
object RDDReader extends App {
  val conf = new SparkConf().setAppName("RDDReader")
  val sc = new SparkContext(conf)
  val ic = new IgniteContext(sc,
    "/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml")
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("sharedRDD")
  val greaterThanFiveHundred = sharedRDD.filter(_._2 > 500)
  println("The count is " + greaterThanFiveHundred.count())
  ic.close(true)
  sc.stop()
}
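
For reference, here is a sketch of the same writer built with a programmatic
IgniteConfiguration instead of the XML file. This is only an illustration, not
what I actually run: the cache settings mirror what the log below reports for
"sharedRDD" (mode PARTITIONED, 1 backup), and the discovery addresses are just
an assumption for a single-host setup.

package com.gridgain

import java.util.Arrays

import org.apache.ignite.cache.CacheMode
import org.apache.ignite.configuration.{CacheConfiguration, IgniteConfiguration}
import org.apache.ignite.spark.{IgniteContext, IgniteRDD}
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch: same writer, but the Ignite configuration is built in code.
object RDDWriterProgrammatic extends App {
  val sc = new SparkContext(new SparkConf().setAppName("RDDWriterProgrammatic"))

  // Builds the configuration; the closure is invoked on every node that starts Ignite.
  def igniteCfg(): IgniteConfiguration = {
    val cacheCfg = new CacheConfiguration[Int, Int]("sharedRDD")
      .setCacheMode(CacheMode.PARTITIONED)
      .setBackups(1)

    val ipFinder = new TcpDiscoveryVmIpFinder()
    ipFinder.setAddresses(Arrays.asList("127.0.0.1:47500..47509")) // assumed local discovery addresses

    new IgniteConfiguration()
      .setCacheConfiguration(cacheCfg)
      .setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder))
  }

  val ic = new IgniteContext(sc, () => igniteCfg())
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("sharedRDD")
  sharedRDD.savePairs(sc.parallelize(1 to 1000, 10).map(i => (i, i)))
  ic.close(true)
  sc.stop()
}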

This is the result of running it:


$SPARK_HOME/bin/spark-submit --class "com.gridgain.RDDWriter" --master
spark://linux-client:7077 ~/spark\ and\ ignite\
issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar 
2019-01-05 12:10:44 WARN  Utils:66 - Your hostname, linux-client resolves to
a loopback address: 127.0.1.1; using 192.168.43.225 instead (on interface
wlp3s0)
2019-01-05 12:10:44 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind
to another address
2019-01-05 12:10:46 WARN  NativeCodeLoader:62 - Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
2019-01-05 12:10:48 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-01-05 12:10:48 INFO  SparkContext:54 - Submitted application: RDDWriter
2019-01-05 12:10:48 INFO  SecurityManager:54 - Changing view acls to: mehdi
2019-01-05 12:10:48 INFO  SecurityManager:54 - Changing modify acls to:
mehdi
2019-01-05 12:10:48 INFO  SecurityManager:54 - Changing view acls groups to: 
2019-01-05 12:10:48 INFO  SecurityManager:54 - Changing modify acls groups
to: 
2019-01-05 12:10:48 INFO  SecurityManager:54 - SecurityManager:
authentication disabled; ui acls disabled; users  with view permissions:
Set(mehdi); groups with view permissions: Set(); users  with modify
permissions: Set(mehdi); groups with modify permissions: Set()
2019-01-05 12:10:51 INFO  Utils:54 - Successfully started service
'sparkDriver' on port 42209.
2019-01-05 12:10:51 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-01-05 12:10:51 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-01-05 12:10:51 INFO  BlockManagerMasterEndpoint:54 - Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2019-01-05 12:10:51 INFO  BlockManagerMasterEndpoint:54 -
BlockManagerMasterEndpoint up
2019-01-05 12:10:51 INFO  DiskBlockManager:54 - Created local directory at
/tmp/blockmgr-97d7b468-57a8-4fb3-a951-25a6a1312922
2019-01-05 12:10:51 INFO  MemoryStore:54 - MemoryStore started with capacity
366.3 MB
2019-01-05 12:10:51 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-01-05 12:10:52 INFO  log:192 - Logging initialized @9014ms
2019-01-05 12:10:52 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build
timestamp: unknown, git hash: unknown
2019-01-05 12:10:52 INFO  Server:419 - Started @9118ms
2019-01-05 12:10:52 INFO  AbstractConnector:278 - Started
ServerConnector@456abb66{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-01-05 12:10:52 INFO  Utils:54 - Successfully started service 'SparkUI'
on port 4040.
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@77e80a5e{/jobs,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1654a892{/jobs/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2577d6c8{/jobs/job,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6c000e0c{/jobs/job/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5f233b26{/stages,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@44f9779c{/stages/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6974a715{/stages/stage,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4c9e9fb8{/stages/stage/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@9ec531{/stages/pool,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@40147317{/stages/pool/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@210f0cc1{/storage,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@19542407{/storage/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6f95cd51{/storage/rdd,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@c7a977f{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@55caeb35{/environment,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6d868997{/environment/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2c383e33{/executors,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@74a195a4{/executors/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@76304b46{/executors/threadDump,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2fa3be26{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4287d447{/static,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5a4ed68f{/,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@367795c7{/api,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1fd386c3{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@edf4f36{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-01-05 12:10:52 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started
at http://192.168.43.225:4040
<http://apache-ignite-users.70518.x6.nabble.com/file/t2160/Screenshot_from_2019-01-05_12-21-06.png> 
2019-01-05 12:10:52 INFO  SparkContext:54 - Added JAR
file:/home/mehdi/spark%20and%20ignite%20issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar
at spark://192.168.43.225:42209/jars/ignite-spark-scala-1.0.jar with
timestamp 1546677652287
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Connecting
to master spark://linux-client:7077...
2019-01-05 12:10:52 INFO  TransportClientFactory:267 - Successfully created
connection to linux-client/127.0.1.1:7077 after 124 ms (0 ms spent in
bootstraps)
2019-01-05 12:10:52 INFO  StandaloneSchedulerBackend:54 - Connected to Spark
cluster with app ID app-20190105121052-0005
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105121052-0005/0 on worker-20190105103259-127.0.1.1-43911
(127.0.1.1:43911) with 2 core(s)
2019-01-05 12:10:52 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105121052-0005/0 on hostPort 127.0.1.1:43911 with 2 core(s),
512.0 MB RAM
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105121052-0005/1 on worker-20190105103304-127.0.1.1-44569
(127.0.1.1:44569) with 2 core(s)
2019-01-05 12:10:52 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105121052-0005/1 on hostPort 127.0.1.1:44569 with 2 core(s),
512.0 MB RAM
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105121052-0005/2 on worker-20190105103301-127.0.1.1-34465
(127.0.1.1:34465) with 2 core(s)
2019-01-05 12:10:52 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105121052-0005/2 on hostPort 127.0.1.1:34465 with 2 core(s),
512.0 MB RAM
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105121052-0005/3 on worker-20190105103256-127.0.1.1-46653
(127.0.1.1:46653) with 2 core(s)
2019-01-05 12:10:52 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105121052-0005/3 on hostPort 127.0.1.1:46653 with 2 core(s),
512.0 MB RAM
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105121052-0005/0 is now RUNNING
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105121052-0005/1 is now RUNNING
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105121052-0005/2 is now RUNNING
2019-01-05 12:10:52 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105121052-0005/3 is now RUNNING
2019-01-05 12:10:53 INFO  Utils:54 - Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 43793.
2019-01-05 12:10:53 INFO  NettyBlockTransferService:54 - Server created on
192.168.43.225:43793
2019-01-05 12:10:53 INFO  BlockManager:54 - Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy
2019-01-05 12:10:53 INFO  BlockManagerMaster:54 - Registering BlockManager
BlockManagerId(driver, 192.168.43.225, 43793, None)
2019-01-05 12:10:53 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 192.168.43.225:43793 with 366.3 MB RAM, BlockManagerId(driver,
192.168.43.225, 43793, None)
2019-01-05 12:10:53 INFO  BlockManagerMaster:54 - Registered BlockManager
BlockManagerId(driver, 192.168.43.225, 43793, None)
2019-01-05 12:10:53 INFO  BlockManager:54 - Initialized BlockManager:
BlockManagerId(driver, 192.168.43.225, 43793, None)
2019-01-05 12:10:53 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@7573e12f{/metrics/json,null,AVAILABLE,@Spark}
2019-01-05 12:10:53 INFO  EventLoggingListener:54 - Logging events to
file:/tmp/spark-events/app-20190105121052-0005
2019-01-05 12:10:54 INFO  StandaloneSchedulerBackend:54 - SchedulerBackend
is ready for scheduling beginning after reached minRegisteredResourcesRatio:
0.0
2019-01-05 12:10:54 INFO  XmlBeanDefinitionReader:317 - Loading XML bean
definitions from URL
[file:/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml]
2019-01-05 12:10:55 INFO  GenericApplicationContext:583 - Refreshing
org.springframework.context.support.GenericApplicationContext@45c2e0a6:
startup date [Sat Jan 05 12:10:55 IRST 2019]; root of context hierarchy
Can't load log handler "org.apache.ignite.logger.java.JavaLoggerFileHandler"
java.lang.ClassNotFoundException:
org.apache.ignite.logger.java.JavaLoggerFileHandler
java.lang.ClassNotFoundException:
org.apache.ignite.logger.java.JavaLoggerFileHandler
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.util.logging.LogManager$5.run(LogManager.java:965)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:958)
	at
java.util.logging.LogManager.initializeGlobalHandlers(LogManager.java:1578)
	at java.util.logging.LogManager.access$1500(LogManager.java:145)
	at
java.util.logging.LogManager$RootLogger.accessCheckedHandlers(LogManager.java:1667)
	at java.util.logging.Logger.getHandlers(Logger.java:1777)
	at
org.apache.ignite.logger.java.JavaLogger.findHandler(JavaLogger.java:411)
	at org.apache.ignite.logger.java.JavaLogger.configure(JavaLogger.java:241)
	at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:181)
	at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:135)
	at
org.apache.ignite.internal.LongJVMPauseDetector.<clinit>(LongJVMPauseDetector.java:44)
	at org.apache.ignite.internal.IgniteKernal.<clinit>(IgniteKernal.java:300)
	at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:2009)
	at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1723)
	at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1151)
	at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:671)
	at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:611)
	at org.apache.ignite.Ignition.getOrStart(Ignition.java:419)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
	at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:63)
	at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:99)
	at
com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:26)
	at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
	at scala.App$class.main(App.scala:76)
	at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
	at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2019-01-05 12:10:56 INFO  IgniteKernal:566 - 

>>>    __________  ________________  
>>>   /  _/ ___/ |/ /  _/_  __/ __/  
>>>  _/ // (7 7    // /  / / / _/    
>>> /___/\___/_/|_/___/ /_/ /___/   
>>> 
>>> ver. 2.6.0#19700101-sha1:DEV
>>> 2018 Copyright(C) Apache Software Foundation
>>> 
>>> Ignite documentation: http://ignite.apache.org

2019-01-05 12:10:56 INFO  IgniteKernal:566 - Config URL: n/a
2019-01-05 12:10:56 INFO  IgniteKernal:566 - IgniteConfiguration
[igniteInstanceName=null, pubPoolSize=8, svcPoolSize=8, callbackPoolSize=8,
stripedPoolSize=8, sysPoolSize=8, mgmtPoolSize=4, igfsPoolSize=8,
dataStreamerPoolSize=8, utilityCachePoolSize=8,
utilityCacheKeepAliveTime=60000, p2pPoolSize=2, qryPoolSize=8,
igniteHome=/usr/local/apache-ignite-fabric-2.6.0-bin/,
igniteWorkDir=/usr/local/apache-ignite-fabric-2.6.0-bin/work,
mbeanSrv=com.sun.jmx.mbeanserver.JmxMBeanServer@149b0577,
nodeId=cbe21a82-2837-46ac-bcaa-12d5acf020a8,
marsh=org.apache.ignite.internal.binary.BinaryMarshaller@4d0e54e0,
marshLocJobs=false, daemon=false, p2pEnabled=false, netTimeout=5000,
sndRetryDelay=1000, sndRetryCnt=3, metricsHistSize=10000,
metricsUpdateFreq=2000, metricsExpTime=9223372036854775807,
discoSpi=TcpDiscoverySpi [addrRslvr=null, sockTimeout=0, ackTimeout=0,
marsh=null, reconCnt=10, reconDelay=2000, maxAckTimeout=600000,
forceSrvMode=false, clientReconnectDisabled=false, internalLsnr=null],
segPlc=STOP, segResolveAttempts=2, waitForSegOnStart=true,
allResolversPassReq=true, segChkFreq=10000, commSpi=TcpCommunicationSpi
[connectGate=null, connPlc=null, enableForcibleNodeKill=false,
enableTroubleshootingLog=false,
srvLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationSpi$2@7cac93fe,
locAddr=null, locHost=null, locPort=47100, locPortRange=100, shmemPort=-1,
directBuf=true, directSndBuf=false, idleConnTimeout=600000,
connTimeout=5000, maxConnTimeout=600000, reconCnt=10, sockSndBuf=32768,
sockRcvBuf=32768, msgQueueLimit=0, slowClientQueueLimit=0, nioSrvr=null,
shmemSrv=null, usePairedConnections=false, connectionsPerNode=1,
tcpNoDelay=true, filterReachableAddresses=false, ackSndThreshold=32,
unackedMsgsBufSize=0, sockWriteTimeout=2000, lsnr=null, boundTcpPort=-1,
boundTcpShmemPort=-1, selectorsCnt=4, selectorSpins=0, addrRslvr=null,
ctxInitLatch=java.util.concurrent.CountDownLatch@479b5066[Count = 1],
stopping=false,
metricsLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationMetricsListener@64deb58f],
evtSpi=org.apache.ignite.spi.eventstorage.NoopEventStorageSpi@2b1cd7bc,
colSpi=NoopCollisionSpi [], deploySpi=LocalDeploymentSpi [lsnr=null],
indexingSpi=org.apache.ignite.spi.indexing.noop.NoopIndexingSpi@7d17ee50,
addrRslvr=null, clientMode=true, rebalanceThreadPoolSize=1,
txCfg=org.apache.ignite.configuration.TransactionConfiguration@44fdce3c,
cacheSanityCheckEnabled=true, discoStartupDelay=60000, deployMode=SHARED,
p2pMissedCacheSize=100, locHost=null, timeSrvPortBase=31100,
timeSrvPortRange=100, failureDetectionTimeout=10000,
clientFailureDetectionTimeout=30000, metricsLogFreq=60000, hadoopCfg=null,
connectorCfg=org.apache.ignite.configuration.ConnectorConfiguration@712c5463,
odbcCfg=null, warmupClos=null, atomicCfg=AtomicConfiguration
[seqReserveSize=1000, cacheMode=PARTITIONED, backups=1, aff=null,
grpName=null], classLdr=null, sslCtxFactory=null, platformCfg=null,
binaryCfg=null, memCfg=null, pstCfg=null, dsCfg=null, activeOnStart=true,
autoActivation=true, longQryWarnTimeout=3000, sqlConnCfg=null,
cliConnCfg=ClientConnectorConfiguration [host=null, port=10800,
portRange=100, sockSndBufSize=0, sockRcvBufSize=0, tcpNoDelay=true,
maxOpenCursorsPerConn=128, threadPoolSize=8, idleTimeout=0,
jdbcEnabled=true, odbcEnabled=true, thinCliEnabled=true, sslEnabled=false,
useIgniteSslCtxFactory=true, sslClientAuth=false, sslCtxFactory=null],
authEnabled=false, failureHnd=null, commFailureRslvr=null]
2019-01-05 12:10:56 INFO  IgniteKernal:566 - Daemon mode: off
2019-01-05 12:10:56 INFO  IgniteKernal:566 - OS: Linux 4.15.0-43-generic
amd64
2019-01-05 12:10:56 INFO  IgniteKernal:566 - OS user: mehdi
2019-01-05 12:10:56 INFO  IgniteKernal:566 - PID: 10091
2019-01-05 12:10:56 INFO  IgniteKernal:566 - Language runtime: Scala ver.
2.11.12
2019-01-05 12:10:56 INFO  IgniteKernal:566 - VM information: Java(TM) SE
Runtime Environment 1.8.0_192-ea-b04 Oracle Corporation Java HotSpot(TM)
64-Bit Server VM 25.192-b04
2019-01-05 12:10:56 INFO  IgniteKernal:566 - VM total memory: 0.89GB
2019-01-05 12:10:56 INFO  IgniteKernal:566 - Remote Management [restart:
off, REST: off, JMX (remote: off)]
2019-01-05 12:10:56 INFO  IgniteKernal:566 - Logger: Log4JLogger
[quiet=false, config=null]
2019-01-05 12:10:56 INFO  IgniteKernal:566 -
IGNITE_HOME=/usr/local/apache-ignite-fabric-2.6.0-bin/
2019-01-05 12:10:56 INFO  IgniteKernal:566 - VM arguments: [-Xmx1g]
2019-01-05 12:10:56 INFO  IgniteKernal:566 - Configured caches [in
'sysMemPlc' dataRegion: ['ignite-sys-cache'], in 'null' dataRegion:
['sharedRDD']]
2019-01-05 12:10:56 INFO  IgniteKernal:566 - 3-rd party licenses can be
found at: /usr/local/apache-ignite-fabric-2.6.0-bin//libs/licenses
2019-01-05 12:10:56 WARN  GridDiagnostic:571 - Initial heap size is 126MB
(should be no less than 512MB, use -Xms512m -Xmx512m).
2019-01-05 12:10:57 INFO  IgnitePluginProcessor:566 - Configured plugins:
2019-01-05 12:10:57 INFO  IgnitePluginProcessor:566 -   ^-- None
2019-01-05 12:10:57 INFO  IgnitePluginProcessor:566 - 
2019-01-05 12:10:57 INFO  FailureProcessor:566 - Configured failure handler:
[hnd=StopNodeOrHaltFailureHandler [tryStop=false, timeout=0]]
2019-01-05 12:10:57 INFO  TcpCommunicationSpi:566 - Successfully bound
communication NIO server to TCP port [port=47101, locHost=0.0.0.0/0.0.0.0,
selectorsCnt=4, selectorSpins=0, pairedConn=false]
2019-01-05 12:10:57 WARN  TcpCommunicationSpi:571 - Message queue limit is
set to 0 which may lead to potential OOMEs when running cache operations in
FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and
receiver sides.
2019-01-05 12:10:57 WARN  NoopCheckpointSpi:571 - Checkpoints are disabled
(to enable configure any GridCheckpointSpi implementation)
2019-01-05 12:10:57 WARN  GridCollisionManager:571 - Collision resolution is
disabled (all jobs will be activated upon arrival).
2019-01-05 12:10:57 INFO  IgniteKernal:566 - Security status
[authentication=off, tls/ssl=off]
2019-01-05 12:10:58 INFO  ClientListenerProcessor:566 - Client connector
processor has started on TCP port 10801
2019-01-05 12:10:58 INFO  GridRestProcessor:566 - REST protocols do not
start on client node. To start the protocols on client node set
'-DIGNITE_REST_START_ON_CLIENT=true' system property.
2019-01-05 12:10:59 INFO  IgniteKernal:566 - Non-loopback local IPs:
10.253.27.88, 192.168.43.225, fe80:0:0:0:92d3:3bf1:ca5a:da00%wlp3s0,
fe80:0:0:0:d421:f9ff:fed4:af7b%kvnet
2019-01-05 12:10:59 INFO  IgniteKernal:566 - Enabled local MACs:
742F68385EAA, D621F9D4AF7B
2019-01-05 12:10:59 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(192.168.43.225:38618) with ID 0
2019-01-05 12:10:59 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(192.168.43.225:38620) with ID 3
2019-01-05 12:11:00 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(192.168.43.225:38622) with ID 1
2019-01-05 12:11:00 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(192.168.43.225:38624) with ID 2
2019-01-05 12:11:00 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:41157 with 127.2 MB RAM, BlockManagerId(0, 127.0.1.1,
41157, None)
2019-01-05 12:11:00 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:41349 with 127.2 MB RAM, BlockManagerId(3, 127.0.1.1,
41349, None)
2019-01-05 12:11:00 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:42461 with 127.2 MB RAM, BlockManagerId(2, 127.0.1.1,
42461, None)
2019-01-05 12:11:00 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:40389 with 127.2 MB RAM, BlockManagerId(1, 127.0.1.1,
40389, None)
2019-01-05 12:11:01 INFO  time:566 - Started exchange init
[topVer=AffinityTopologyVersion [topVer=8, minorTopVer=0], crd=false,
evt=NODE_JOINED, evtNode=cbe21a82-2837-46ac-bcaa-12d5acf020a8,
customEvt=null, allowMerge=true]
2019-01-05 12:11:01 INFO  GridCacheProcessor:566 - Started cache
[name=ignite-sys-cache, id=-2100569601, memoryPolicyName=sysMemPlc,
mode=REPLICATED, atomicity=TRANSACTIONAL, backups=2147483647]
2019-01-05 12:11:01 INFO  TcpCommunicationSpi:566 - Established outgoing
communication connection [locAddr=/0:0:0:0:0:0:0:1:43518,
rmtAddr=/0:0:0:0:0:0:0:1%lo:47100]
2019-01-05 12:11:01 INFO  GridCacheProcessor:566 - Started cache
[name=sharedRDD, id=-1581581875, memoryPolicyName=null, mode=PARTITIONED,
atomicity=ATOMIC, backups=1]
2019-01-05 12:11:01 INFO  time:566 - Finished exchange init
[topVer=AffinityTopologyVersion [topVer=8, minorTopVer=0], crd=false]
2019-01-05 12:11:01 INFO  GridDhtPartitionsExchangeFuture:566 - Received
full message, will finish exchange
[node=0a01ccfc-6d3a-4490-bdd8-90cf3b71928d, resVer=AffinityTopologyVersion
[topVer=8, minorTopVer=0]]
2019-01-05 12:11:01 INFO  GridDhtPartitionsExchangeFuture:566 - Finish
exchange future [startVer=AffinityTopologyVersion [topVer=8, minorTopVer=0],
resVer=AffinityTopologyVersion [topVer=8, minorTopVer=0], err=null]
2019-01-05 12:11:01 INFO  IgniteKernal:566 - Performance suggestions for
grid  (fix if possible)
2019-01-05 12:11:01 INFO  IgniteKernal:566 - To disable, set
-DIGNITE_PERFORMANCE_SUGGESTIONS_DISABLED=true
2019-01-05 12:11:01 INFO  IgniteKernal:566 -   ^-- Enable G1 Garbage
Collector (add '-XX:+UseG1GC' to JVM options)
2019-01-05 12:11:01 INFO  IgniteKernal:566 -   ^-- Set max direct memory
size if getting 'OOME: Direct buffer memory' (add
'-XX:MaxDirectMemorySize=<size>[g|G|m|M|k|K]' to JVM options)
2019-01-05 12:11:01 INFO  IgniteKernal:566 -   ^-- Disable processing of
calls to System.gc() (add '-XX:+DisableExplicitGC' to JVM options)
2019-01-05 12:11:01 INFO  IgniteKernal:566 -   ^-- Decrease number of
backups (set 'backups' to 0)
2019-01-05 12:11:01 INFO  IgniteKernal:566 - Refer to this page for more
performance suggestions:
https://apacheignite.readme.io/docs/jvm-and-system-tuning
2019-01-05 12:11:01 INFO  IgniteKernal:566 - 
2019-01-05 12:11:01 INFO  IgniteKernal:566 - To start Console Management &
Monitoring run ignitevisorcmd.{sh|bat}
2019-01-05 12:11:01 INFO  IgniteKernal:566 - 
2019-01-05 12:11:01 INFO  IgniteKernal:566 - 

>>> +---------------------------------+
>>> Ignite ver. 2.6.0#19700101-sha1:DEV
>>> +---------------------------------+
>>> OS name: Linux 4.15.0-43-generic amd64
>>> CPU(s): 8
>>> Heap: 0.89GB
>>> VM name: 10091@linux-client
>>> Local node [ID=CBE21A82-2837-46AC-BCAA-12D5ACF020A8, order=8,
>>> clientMode=true]
>>> Local node addresses: [10.253.27.88/0:0:0:0:0:0:0:1%lo,
>>> 192.168.43.225/10.253.27.88, /127.0.0.1, /192.168.43.225]
>>> Local ports: TCP:10801 TCP:47101 UDP:47400 

2019-01-05 12:11:01 INFO  GridDiscoveryManager:566 - Topology snapshot
[ver=8, servers=1, clients=1, CPUs=16, offheap=1.6GB, heap=1.9GB]
2019-01-05 12:11:01 INFO  GridDiscoveryManager:566 -   ^-- Node
[id=CBE21A82-2837-46AC-BCAA-12D5ACF020A8, clusterState=ACTIVE]
2019-01-05 12:11:02 INFO  SparkContext:54 - Starting job: foreachPartition
at IgniteRDD.scala:233
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Got job 0 (foreachPartition at
IgniteRDD.scala:233) with 10 output partitions
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Final stage: ResultStage 0
(foreachPartition at IgniteRDD.scala:233)
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Parents of final stage: List()
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Missing parents: List()
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Submitting ResultStage 0
(MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28), which has no
missing parents
2019-01-05 12:11:02 INFO  MemoryStore:54 - Block broadcast_0 stored as
values in memory (estimated size 4.6 KB, free 366.3 MB)
2019-01-05 12:11:02 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored
as bytes in memory (estimated size 2.6 KB, free 366.3 MB)
2019-01-05 12:11:02 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 192.168.43.225:43793 (size: 2.6 KB, free: 366.3 MB)
2019-01-05 12:11:02 INFO  SparkContext:54 - Created broadcast 0 from
broadcast at DAGScheduler.scala:1161
2019-01-05 12:11:02 INFO  DAGScheduler:54 - Submitting 10 missing tasks from
ResultStage 0 (MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28)
(first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2019-01-05 12:11:02 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 10
tasks
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0
(TID 0, 127.0.1.1, executor 2, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 1.0 in stage 0.0
(TID 1, 127.0.1.1, executor 1, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 2.0 in stage 0.0
(TID 2, 127.0.1.1, executor 3, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 3.0 in stage 0.0
(TID 3, 127.0.1.1, executor 0, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 4.0 in stage 0.0
(TID 4, 127.0.1.1, executor 2, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 5.0 in stage 0.0
(TID 5, 127.0.1.1, executor 1, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 6.0 in stage 0.0
(TID 6, 127.0.1.1, executor 3, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:02 INFO  TaskSetManager:54 - Starting task 7.0 in stage 0.0
(TID 7, 127.0.1.1, executor 0, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:03 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:41349 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 12:11:03 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:42461 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 12:11:03 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:40389 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 12:11:03 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:41157 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 8.0 in stage 0.0
(TID 8, 127.0.1.1, executor 3, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 9.0 in stage 0.0
(TID 9, 127.0.1.1, executor 2, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 12:11:05 WARN  TaskSetManager:66 - Lost task 1.0 in stage 0.0
(TID 1, 127.0.1.1, executor 1): java.lang.NoClassDefFoundError: Could not
initialize class org.apache.ignite.internal.util.IgniteUtils
	at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
	at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:844)

2019-01-05 12:11:05 WARN  TaskSetManager:66 - Lost task 4.0 in stage 0.0
(TID 4, 127.0.1.1, executor 2): java.lang.ExceptionInInitializerError
	at
org.apache.ignite.internal.util.IgniteUtils.<clinit>(IgniteUtils.java:769)
	at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
	at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.RuntimeException: jdk.internal.misc.JavaNioAccess class
is unavailable.
	at
org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1453)
	at org.apache.ignite.internal.util.GridUnsafe.<clinit>(GridUnsafe.java:112)
	... 17 more
Caused by: java.lang.IllegalAccessException: class
org.apache.ignite.internal.util.GridUnsafe cannot access class
jdk.internal.misc.SharedSecrets (in module java.base) because module
java.base does not export jdk.internal.misc to unnamed module @3eee9730
	at
java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:360)
	at
java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:589)
	at java.base/java.lang.reflect.Method.invoke(Method.java:556)
	at
org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1450)
	... 18 more

2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 4.1 in stage 0.0
(TID 10, 127.0.1.1, executor 0, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 1.1 in stage 0.0
(TID 11, 127.0.1.1, executor 0, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 6.0 in stage 0.0
(TID 6) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 1]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 0.0 in stage 0.0
(TID 0) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 2]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 8.0 in stage 0.0
(TID 8) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 3]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 9.0 in stage 0.0
(TID 9) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 4]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 9.1 in stage 0.0
(TID 12, 127.0.1.1, executor 2, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 8.1 in stage 0.0
(TID 13, 127.0.1.1, executor 3, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 0.1 in stage 0.0
(TID 14, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 6.1 in stage 0.0
(TID 15, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 3.0 in stage 0.0
(TID 3) on 127.0.1.1, executor 0: java.lang.ExceptionInInitializerError
(null) [duplicate 1]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 2.0 in stage 0.0
(TID 2) on 127.0.1.1, executor 3: java.lang.ExceptionInInitializerError
(null) [duplicate 2]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 7.0 in stage 0.0
(TID 7) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 5]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 5.0 in stage 0.0
(TID 5) on 127.0.1.1, executor 1: java.lang.ExceptionInInitializerError
(null) [duplicate 3]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 5.1 in stage 0.0
(TID 16, 127.0.1.1, executor 3, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 7.1 in stage 0.0
(TID 17, 127.0.1.1, executor 1, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 2.1 in stage 0.0
(TID 18, 127.0.1.1, executor 0, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 1.1 in stage 0.0
(TID 11) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 6]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 1.2 in stage 0.0
(TID 19, 127.0.1.1, executor 3, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 5.1 in stage 0.0
(TID 16) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 7]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 0.1 in stage 0.0
(TID 14) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 8]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 0.2 in stage 0.0
(TID 20, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 5.2 in stage 0.0
(TID 21, 127.0.1.1, executor 1, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 3.1 in stage 0.0
(TID 22, 127.0.1.1, executor 2, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 7.1 in stage 0.0
(TID 17) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 9]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 6.1 in stage 0.0
(TID 15) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 10]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 6.2 in stage 0.0
(TID 23, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 9.1 in stage 0.0
(TID 12) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 11]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 9.2 in stage 0.0
(TID 24, 127.0.1.1, executor 0, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 7.2 in stage 0.0
(TID 25, 127.0.1.1, executor 3, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 4.1 in stage 0.0
(TID 10) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 12]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 8.1 in stage 0.0
(TID 13) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 13]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 8.2 in stage 0.0
(TID 26, 127.0.1.1, executor 0, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 2.1 in stage 0.0
(TID 18) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 14]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 2.2 in stage 0.0
(TID 27, 127.0.1.1, executor 3, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 1.2 in stage 0.0
(TID 19) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 15]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 3.1 in stage 0.0
(TID 22) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 16]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 3.2 in stage 0.0
(TID 28, 127.0.1.1, executor 2, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 7.2 in stage 0.0
(TID 25) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 17]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 7.3 in stage 0.0
(TID 29, 127.0.1.1, executor 3, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 1.3 in stage 0.0
(TID 30, 127.0.1.1, executor 1, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 4.2 in stage 0.0
(TID 31, 127.0.1.1, executor 1, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 0.2 in stage 0.0
(TID 20) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 18]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 5.2 in stage 0.0
(TID 21) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 19]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 5.3 in stage 0.0
(TID 32, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 0.3 in stage 0.0
(TID 33, 127.0.1.1, executor 2, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 9.2 in stage 0.0
(TID 24) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 20]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 6.2 in stage 0.0
(TID 23) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 21]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 6.3 in stage 0.0
(TID 34, 127.0.1.1, executor 0, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 8.2 in stage 0.0
(TID 26) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 22]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 8.3 in stage 0.0
(TID 35, 127.0.1.1, executor 1, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Starting task 9.3 in stage 0.0
(TID 36, 127.0.1.1, executor 3, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 7.3 in stage 0.0
(TID 29) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 23]
2019-01-05 12:11:05 ERROR TaskSetManager:70 - Task 7 in stage 0.0 failed 4
times; aborting job
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 4.2 in stage 0.0
(TID 31) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 24]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 0.3 in stage 0.0
(TID 33) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 25]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 3.2 in stage 0.0
(TID 28) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 26]
2019-01-05 12:11:05 INFO  TaskSchedulerImpl:54 - Cancelling stage 0
2019-01-05 12:11:05 INFO  TaskSchedulerImpl:54 - Killing all running tasks
in stage 0: Stage cancelled
2019-01-05 12:11:05 INFO  TaskSchedulerImpl:54 - Stage 0 was cancelled
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 2.2 in stage 0.0
(TID 27) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 27]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 1.3 in stage 0.0
(TID 30) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 28]
2019-01-05 12:11:05 INFO  TaskSetManager:54 - Lost task 5.3 in stage 0.0
(TID 32) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 29]
2019-01-05 12:11:05 INFO  DAGScheduler:54 - ResultStage 0 (foreachPartition
at IgniteRDD.scala:233) failed in 3.210 s due to Job aborted due to stage
failure: Task 7 in stage 0.0 failed 4 times, most recent failure: Lost task
7.3 in stage 0.0 (TID 29, 127.0.1.1, executor 3):
java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
	at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
	at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
2019-01-05 12:11:06 INFO  TaskSetManager:54 - Lost task 8.3 in stage 0.0
(TID 35) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 30]
2019-01-05 12:11:06 INFO  DAGScheduler:54 - Job 0 failed: foreachPartition
at IgniteRDD.scala:233, took 3.386498 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 7 in stage 0.0 failed 4 times, most recent failure:
Lost task 7.3 in stage 0.0 (TID 29, 127.0.1.1, executor 3):
java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
	at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
	at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
	at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
	at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
	at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
	at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
	at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
	at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
	at scala.Option.foreach(Option.scala:257)
	at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
	at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
	at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
	at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:933)
	at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:933)
	at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:233)
	at
com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:28)
	at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
	at scala.App$class.main(App.scala:76)
	at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
	at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
	at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
	at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
	at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.lang.Thread.run(Thread.java:844)
2019-01-05 12:11:06 INFO  TaskSetManager:54 - Lost task 9.3 in stage 0.0
(TID 36) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 31]
2019-01-05 12:11:06 INFO  TaskSetManager:54 - Lost task 6.3 in stage 0.0
(TID 34) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 32]
2019-01-05 12:11:06 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool 
2019-01-05 12:12:02 INFO  IgniteKernal:566 - 
Metrics for local node (to disable set 'metricsLogFrequency' to 0)
    ^-- Node [id=cbe21a82, uptime=00:01:00.010]
    ^-- H/N/C [hosts=2, nodes=2, CPUs=16]
    ^-- CPU [cur=0.2%, avg=0.81%, GC=0%]
    ^-- PageMemory [pages=0]
    ^-- Heap [used=336MB, free=63.06%, comm=503MB]
    ^-- Non heap [used=83MB, free=-1%, comm=85MB]
    ^-- Outbound messages queue [size=0]
    ^-- Public thread pool [active=0, idle=0, qSize=0]
    ^-- System thread pool [active=0, idle=0, qSize=0]
2019-01-05 12:12:11 INFO  GridUpdateNotifier:566 - Update status is not
available.
2019-01-05 12:13:02 INFO  IgniteKernal:566 - 
Metrics for local node (to disable set 'metricsLogFrequency' to 0)
    ^-- Node [id=cbe21a82, uptime=00:02:00.030]
    ^-- H/N/C [hosts=2, nodes=2, CPUs=16]
    ^-- CPU [cur=0.23%, avg=0.57%, GC=0%]
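
For reference, the project dependencies. This is only a sketch using the
standard Apache coordinates for the versions shown in the log (Spark 2.4.0,
Ignite 2.6.0, Scala 2.11.12); my actual build file may differ slightly:

// build.sbt sketch (illustrative only, not the exact build file)
name := "ignite-spark-scala"
version := "1.0"
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "2.4.0" % "provided", // supplied by spark-submit at runtime
  "org.apache.ignite"  % "ignite-core"   % "2.6.0",
  "org.apache.ignite"  % "ignite-spark"  % "2.6.0",
  "org.apache.ignite"  % "ignite-spring" % "2.6.0"               // needed to load the Spring XML config
)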



