Posted to user@ignite.apache.org by mehdi sey <se...@gmail.com> on 2019/01/15 06:43:04 UTC

read from igniteRDD and write to igniteRDD

Hi. As we know, we can create an IgniteRDD for sharing state between Spark workers. I
want to know how we can read from an IgniteRDD from a Spark executor and how to
write to an IgniteRDD from a Spark executor. Is it possible to share an example?




Re: read from igniteRDD and write to igniteRDD

Posted by Mikhail <mi...@gmail.com>.
Hi Mehdi

I think first you need to read the following doc:
https://apacheignite-fs.readme.io/docs/ignitecontext-igniterdd

It describes how to properly set up an Ignite cluster and create an IgniteRDD, and it
also has examples; a minimal sketch is pasted below as well.
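
In case it helps, here is a minimal Java sketch of how an IgniteRDD is usually read and
written from Spark executors. Treat it only as an illustration: the class name, the
cache name "sharedRDD" and the config file path are placeholders, so adapt them to your
own setup.

package org.apache.ignite.examples.spark;

import java.util.ArrayList;
import java.util.List;
import org.apache.ignite.spark.JavaIgniteContext;
import org.apache.ignite.spark.JavaIgniteRDD;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class SharedRDDSketch {
    public static void main(String[] args) {
        // Plain Spark context.
        SparkConf conf = new SparkConf().setAppName("SharedRDDSketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // IgniteContext is the entry point: it starts an Ignite node on the driver
        // and on every executor that touches the RDD.
        JavaIgniteContext<Integer, Integer> ic =
            new JavaIgniteContext<>(sc, "examples/config/example-ignite.xml");

        // IgniteRDD is a live view of the "sharedRDD" cache, shared by all Spark workers.
        JavaIgniteRDD<Integer, Integer> sharedRDD = ic.fromCache("sharedRDD");

        // WRITE: the executors push key-value pairs into the Ignite cache.
        List<Tuple2<Integer, Integer>> data = new ArrayList<>();
        for (int i = 0; i < 1000; i++)
            data.add(new Tuple2<>(i, i * 10));

        JavaPairRDD<Integer, Integer> toSave = sc.parallelizePairs(data, 4);
        sharedRDD.savePairs(toSave);

        // READ: the executors read the same cache back through ordinary RDD operations.
        long evenCnt = sharedRDD.filter(t -> t._2() % 2 == 0).count();
        System.out.println(">>> Even values in the shared RDD: " + evenCnt);

        sc.stop();
    }
}

savePairs() runs on the executors and writes straight into the cache, and fromCache()
gives every executor a view over the same cache, so both reads and writes are
distributed.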

If you still have questions after reading the documentation, please
describe your case and what you want to implement in more detail.

Thanks,
Mike.




Re: read from igniteRDD and write to igniteRDD

Posted by mehdi sey <se...@gmail.com>.
I have written a piece of code for reading data from an Ignite cache table, but I
encounter the following error:

/usr/lib/jvm/java-8-oracle/bin/java
-javaagent:/snap/intellij-idea-community/113/lib/idea_rt.jar=46131:/snap/intellij-idea-community/113/bin
-Dfile.encoding=UTF-8 -classpath
/usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/deploy.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/cldrdata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/dnsns.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/jaccess.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/jfxrt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/localedata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/nashorn.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunec.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunjce_provider.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunpkcs11.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/zipfs.jar:/usr/lib/jvm/java-8-oracle/jre/lib/javaws.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfxswt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/management-agent.jar:/usr/lib/jvm/java-8-oracle/jre/lib/plugin.jar:/usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/home/mehdi/apache-ignite-2.6.0-src/examples/target/classes:/home/mehdi/.m2/repository/javax/cache/cache-api/1.0.0/cache-api-1.0.0.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-core/2.6.0/ignite-core-2.6.0.jar:/home/mehdi/.m2/repository/org/jetbrains/annotations/13.0/annotations-13.0.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-spring/2.6.0/ignite-spring-2.6.0.jar:/home/mehdi/.m2/repository/org/springframework/spring-core/4.3.16.RELEASE/spring-core-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-aop/4.3.16.RELEASE/spring-aop-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-beans/4.3.16.RELEASE/spring-beans-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-context/4.3.16.RELEASE/spring-context-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-expression/4.3.16.RELEASE/spring-expression-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-tx/4.3.16.RELEASE/spring-tx-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/org/springframework/spring-jdbc/4.3.16.RELEASE/spring-jdbc-4.3.16.RELEASE.jar:/home/mehdi/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-log4j/2.6.0/ignite-log4j-2.6.0.jar:/home/mehdi/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-indexing/2.6.0/ignite-indexing-2.6.0.jar:/home/mehdi/.m2/repository/commons-codec/commons-codec/1.11/commons-codec-1.11.jar:/home/mehdi/.m2/repository/org/apache/lucene/lucene-core/5.5.2/lucene-core-5.5.2.jar:/home/mehdi/.m2/repository/org/apache/lucene/lucene-analyzers-common/5.5.2/lucene-analyzers-common-5.5.2.jar:/home/mehdi/.m2/repository/org/apache/lucene/lucene-queryparser/5.5.2/lucene-queryparser-5.5.2.jar:/home/mehdi/.m2/repository/org/apache/lucene/lucene-queries/5.5.2/lucene-queries-5.5.2.jar:/home/mehdi/.m2/repository/org/apache/lucene/lucene-sandbox/5.5.2/lucene-sandbox-5.5.2.jar:/home/mehdi/.m2/repository/com/h2database/h2/1.4.195/h2-1.4.195.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-ml/2.6.0/ignite-ml-2.6.0.jar:/home/mehdi/.m2/repository/it/unimi/dsi/fastutil/7.0.12/fastutil-7.0.12.jar:/home/mehdi/.m2/repository/com/github/fommil/netlib/core/1.1.2/core-1.1.2.jar:/home/mehdi/.m2/repository/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-rng-core/1.0/commons-rng-core-1.0.jar:/home/m
ehdi/.m2/repository/org/apache/commons/commons-rng-client-api/1.0/commons-rng-client-api-1.0.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-rng-simple/1.0/commons-rng-simple-1.0.jar:/home/mehdi/.m2/repository/com/zaxxer/SparseBitSet/1.0/SparseBitSet-1.0.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-spring-data/2.6.0/ignite-spring-data-2.6.0.jar:/home/mehdi/.m2/repository/org/springframework/data/spring-data-commons/1.13.11.RELEASE/spring-data-commons-1.13.11.RELEASE.jar:/home/mehdi/.m2/repository/org/slf4j/slf4j-api/1.7.25/slf4j-api-1.7.25.jar:/home/mehdi/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/mehdi/.m2/repository/com/google/code/simple-spring-memcached/spymemcached/2.8.4/spymemcached-2.8.4.jar:/usr/lib/jvm/java-8-oracle/lib/tools.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-core_2.11/2.4.0/spark-core_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/home/mehdi/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/mehdi/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/mehdi/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-compress/1.8.1/commons-compress-1.8.1.jar:/home/mehdi/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar:/home/mehdi/.m2/repository/org/apache/avro/avro-mapred/1.8.2/avro-mapred-1.8.2.jar:/home/mehdi/.m2/repository/org/apache/avro/avro-ipc/1.8.2/avro-ipc-1.8.2.jar:/home/mehdi/.m2/repository/commons-codec/commons-codec/1.9/commons-codec-1.9.jar:/home/mehdi/.m2/repository/com/twitter/chill_2.11/0.9.3/chill_2.11-0.9.3.jar:/home/mehdi/.m2/repository/com/esotericsoftware/kryo-shaded/4.0.2/kryo-shaded-4.0.2.jar:/home/mehdi/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/home/mehdi/.m2/repository/org/objenesis/objenesis/2.5.1/objenesis-2.5.1.jar:/home/mehdi/.m2/repository/com/twitter/chill-java/0.9.3/chill-java-0.9.3.jar:/home/mehdi/.m2/repository/org/apache/xbean/xbean-asm6-shaded/4.8/xbean-asm6-shaded-4.8.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-client/2.6.5/hadoop-client-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-common/2.6.5/hadoop-common-2.6.5.jar:/home/mehdi/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/mehdi/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/mehdi/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/mehdi/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/mehdi/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/mehdi/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/mehdi/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/mehdi/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/mehdi/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/mehdi/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/mehdi/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-auth/2.6.5/hadoop-auth-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar:/home/mehdi/.m2/repository/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar:/home/meh
di/.m2/repository/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/home/mehdi/.m2/repository/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/home/mehdi/.m2/repository/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/home/mehdi/.m2/repository/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/home/mehdi/.m2/repository/org/apache/curator/curator-client/2.6.0/curator-client-2.6.0.jar:/home/mehdi/.m2/repository/org/htrace/htrace-core/3.0.4/htrace-core-3.0.4.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.6.5/hadoop-hdfs-2.6.5.jar:/home/mehdi/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/mehdi/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/home/mehdi/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.6.5/hadoop-mapreduce-client-app-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.6.5/hadoop-mapreduce-client-common-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.6.5/hadoop-yarn-client-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.6.5/hadoop-yarn-server-common-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.6.5/hadoop-mapreduce-client-shuffle-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.6.5/hadoop-yarn-api-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.6.5/hadoop-mapreduce-client-core-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.6.5/hadoop-yarn-common-2.6.5.jar:/home/mehdi/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/mehdi/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/home/mehdi/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar:/home/mehdi/.m2/repository/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.6.5/hadoop-mapreduce-client-jobclient-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/hadoop/hadoop-annotations/2.6.5/hadoop-annotations-2.6.5.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-launcher_2.11/2.4.0/spark-launcher_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-kvstore_2.11/2.4.0/spark-kvstore_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.6.7/jackson-core-2.6.7.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.6.7/jackson-annotations-2.6.7.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-network-common_2.11/2.4.0/spark-network-common_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-network-shuffle_2.11/2.4.0/spark-network-shuffle_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-unsafe_2.11/2.4.0/spark-unsafe_2.11-2.4.0.jar:/home/mehdi/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/home/mehdi/.m2/repository/org/apache/curator/curator-recipes/2.6.0/curator-recipes-2.6.0.jar:/home/mehdi/.m2/repository/org/apache/curator/curator-framework/2.6.0/curator-framework-2.6.0.jar:/home/mehdi/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar:/home/mehdi/.m2/repository/or
g/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/home/mehdi/.m2/repository/javax/servlet/javax.servlet-api/3.1.0/javax.servlet-api-3.1.0.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-lang3/3.5/commons-lang3-3.5.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/mehdi/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/mehdi/.m2/repository/org/slf4j/slf4j-api/1.7.16/slf4j-api-1.7.16.jar:/home/mehdi/.m2/repository/org/slf4j/jul-to-slf4j/1.7.16/jul-to-slf4j-1.7.16.jar:/home/mehdi/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.16/jcl-over-slf4j-1.7.16.jar:/home/mehdi/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar:/home/mehdi/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/home/mehdi/.m2/repository/org/xerial/snappy/snappy-java/1.1.7.1/snappy-java-1.1.7.1.jar:/home/mehdi/.m2/repository/org/lz4/lz4-java/1.4.0/lz4-java-1.4.0.jar:/home/mehdi/.m2/repository/com/github/luben/zstd-jni/1.3.2-2/zstd-jni-1.3.2-2.jar:/home/mehdi/.m2/repository/org/roaringbitmap/RoaringBitmap/0.5.11/RoaringBitmap-0.5.11.jar:/home/mehdi/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/mehdi/.m2/repository/org/scala-lang/scala-library/2.11.12/scala-library-2.11.12.jar:/home/mehdi/.m2/repository/org/json4s/json4s-jackson_2.11/3.5.3/json4s-jackson_2.11-3.5.3.jar:/home/mehdi/.m2/repository/org/json4s/json4s-core_2.11/3.5.3/json4s-core_2.11-3.5.3.jar:/home/mehdi/.m2/repository/org/json4s/json4s-ast_2.11/3.5.3/json4s-ast_2.11-3.5.3.jar:/home/mehdi/.m2/repository/org/json4s/json4s-scalap_2.11/3.5.3/json4s-scalap_2.11-3.5.3.jar:/home/mehdi/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.6/scala-xml_2.11-1.0.6.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/core/jersey-client/2.22.2/jersey-client-2.22.2.jar:/home/mehdi/.m2/repository/javax/ws/rs/javax.ws.rs-api/2.0.1/javax.ws.rs-api-2.0.1.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/hk2-api/2.4.0-b34/hk2-api-2.4.0-b34.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/hk2-utils/2.4.0-b34/hk2-utils-2.4.0-b34.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.4.0-b34/aopalliance-repackaged-2.4.0-b34.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/external/javax.inject/2.4.0-b34/javax.inject-2.4.0-b34.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/hk2-locator/2.4.0-b34/hk2-locator-2.4.0-b34.jar:/home/mehdi/.m2/repository/org/javassist/javassist/3.18.1-GA/javassist-3.18.1-GA.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/core/jersey-common/2.22.2/jersey-common-2.22.2.jar:/home/mehdi/.m2/repository/javax/annotation/javax.annotation-api/1.2/javax.annotation-api-1.2.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/bundles/repackaged/jersey-guava/2.22.2/jersey-guava-2.22.2.jar:/home/mehdi/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.1/osgi-resource-locator-1.0.1.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/core/jersey-server/2.22.2/jersey-server-2.22.2.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/media/jersey-media-jaxb/2.22.2/jersey-media-jaxb-2.22.2.jar:/home/mehdi/.m2/repository/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.22.2/jersey-container-servlet-2.22.2.jar:/home/mehdi/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet-core/2.22.2/jersey-container-servlet-core-2.22.2.jar:/home/mehdi/.m2/repository/io/netty/netty-a
ll/4.1.17.Final/netty-all-4.1.17.Final.jar:/home/mehdi/.m2/repository/io/netty/netty/3.9.9.Final/netty-3.9.9.Final.jar:/home/mehdi/.m2/repository/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/home/mehdi/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.5/metrics-core-3.1.5.jar:/home/mehdi/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.5/metrics-jvm-3.1.5.jar:/home/mehdi/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.5/metrics-json-3.1.5.jar:/home/mehdi/.m2/repository/io/dropwizard/metrics/metrics-graphite/3.1.5/metrics-graphite-3.1.5.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.7.1/jackson-databind-2.6.7.1.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.11/2.6.7.1/jackson-module-scala_2.11-2.6.7.1.jar:/home/mehdi/.m2/repository/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/module/jackson-module-paranamer/2.7.9/jackson-module-paranamer-2.7.9.jar:/home/mehdi/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/home/mehdi/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/mehdi/.m2/repository/net/razorvine/pyrolite/4.13/pyrolite-4.13.jar:/home/mehdi/.m2/repository/net/sf/py4j/py4j/0.10.7/py4j-0.10.7.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-tags_2.11/2.4.0/spark-tags_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/commons/commons-crypto/1.0.0/commons-crypto-1.0.0.jar:/home/mehdi/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-sql_2.11/2.4.0/spark-sql_2.11-2.4.0.jar:/home/mehdi/.m2/repository/com/univocity/univocity-parsers/2.7.3/univocity-parsers-2.7.3.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-sketch_2.11/2.4.0/spark-sketch_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/spark/spark-catalyst_2.11/2.4.0/spark-catalyst_2.11-2.4.0.jar:/home/mehdi/.m2/repository/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar:/home/mehdi/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.1.0/scala-parser-combinators_2.11-1.1.0.jar:/home/mehdi/.m2/repository/org/codehaus/janino/janino/3.0.9/janino-3.0.9.jar:/home/mehdi/.m2/repository/org/codehaus/janino/commons-compiler/3.0.9/commons-compiler-3.0.9.jar:/home/mehdi/.m2/repository/org/antlr/antlr4-runtime/4.7/antlr4-runtime-4.7.jar:/home/mehdi/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/home/mehdi/.m2/repository/org/apache/orc/orc-core/1.5.2/orc-core-1.5.2.jar:/home/mehdi/.m2/repository/org/apache/orc/orc-shims/1.5.2/orc-shims-1.5.2.jar:/home/mehdi/.m2/repository/io/airlift/aircompressor/0.10/aircompressor-0.10.jar:/home/mehdi/.m2/repository/org/apache/orc/orc-mapreduce/1.5.2/orc-mapreduce-1.5.2.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-column/1.10.0/parquet-column-1.10.0.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-common/1.10.0/parquet-common-1.10.0.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-encoding/1.10.0/parquet-encoding-1.10.0.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-hadoop/1.10.0/parquet-hadoop-1.10.0.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-format/2.4.0/parquet-format-2.4.0.jar:/home/mehdi/.m2/repository/org/apache/parquet/parquet-jackson/1.10.0/parquet-jackson-1.10.0.jar:/home/mehdi/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.6.0/jackson-annotations-2.6.0.jar:/home/mehdi/.m2/repository/org/apache/arrow/arrow-vector/0.10.0/arrow-vector-0.10.
0.jar:/home/mehdi/.m2/repository/org/apache/arrow/arrow-format/0.10.0/arrow-format-0.10.0.jar:/home/mehdi/.m2/repository/org/apache/arrow/arrow-memory/0.10.0/arrow-memory-0.10.0.jar:/home/mehdi/.m2/repository/joda-time/joda-time/2.9.9/joda-time-2.9.9.jar:/home/mehdi/.m2/repository/com/carrotsearch/hppc/0.7.2/hppc-0.7.2.jar:/home/mehdi/.m2/repository/com/vlkan/flatbuffers/1.2.0-3f79e055/flatbuffers-1.2.0-3f79e055.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-spark/2.6.0/ignite-spark-2.6.0-sources.jar:/home/mehdi/.m2/repository/org/apache/ignite/ignite-spark/2.3.0/ignite-spark-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/ignite-spark-2.6.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/hadoop-common-2.6.5.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/scala-library-2.11.8.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/scala-reflect-2.11.8.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-sql_2.11-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-core_2.11-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-tags_2.11-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/json4s-core_2.11-3.5.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/jackson-annotations-2.6.5.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-catalyst_2.11-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-network-common_2.11-2.3.0.jar:/usr/local/apache-ignite-fabric-2.6.0-bin/libs/optional/ignite-spark/spark-network-shuffle_2.11-2.3.0.jar
org.apache.ignite.examples.sql.IgniteSparkSession
[13:25:02]    __________  ________________ 
[13:25:02]   /  _/ ___/ |/ /  _/_  __/ __/ 
[13:25:02]  _/ // (7 7    // /  / / / _/   
[13:25:02] /___/\___/_/|_/___/ /_/ /___/  
[13:25:02] 
[13:25:02] ver. 2.6.0#19700101-sha1:DEV
[13:25:02] 2018 Copyright(C) Apache Software Foundation
[13:25:02] 
[13:25:02] Ignite documentation: http://ignite.apache.org
[13:25:02] 
[13:25:02] Quiet mode.
[13:25:02]   ^-- Logging to file
'/home/mehdi/apache-ignite-2.6.0-src/work/log/ignite-ed98ec56.log'
[13:25:02]   ^-- Logging by 'Log4JLogger [quiet=true,
config=/home/mehdi/apache-ignite-2.6.0-src/config/ignite-log4j.xml]'
[13:25:02]   ^-- To see **FULL** console log here add -DIGNITE_QUIET=false
or "-v" to ignite.{sh|bat}
[13:25:02] 
[13:25:02] OS: Linux 4.15.0-43-generic amd64
[13:25:02] VM information: Java(TM) SE Runtime Environment 1.8.0_181-b13
Oracle Corporation Java HotSpot(TM) 64-Bit Server VM 25.181-b13
[13:25:02] Initial heap size is 126MB (should be no less than 512MB, use
-Xms512m -Xmx512m).
[13:25:02] Configured plugins:
[13:25:02]   ^-- None
[13:25:02] 
[13:25:02] Configured failure handler: [hnd=StopNodeOrHaltFailureHandler
[tryStop=false, timeout=0]]
[13:25:03] Message queue limit is set to 0 which may lead to potential OOMEs
when running cache operations in FULL_ASYNC or PRIMARY_SYNC modes due to
message queues growth on sender and receiver sides.
[13:25:03] Security status [authentication=off, tls/ssl=off]
[13:25:05] Performance suggestions for grid  (fix if possible)
[13:25:05] To disable, set -DIGNITE_PERFORMANCE_SUGGESTIONS_DISABLED=true
[13:25:05]   ^-- Disable grid events (remove 'includeEventTypes' from
configuration)
[13:25:05]   ^-- Enable G1 Garbage Collector (add '-XX:+UseG1GC' to JVM
options)
[13:25:05]   ^-- Specify JVM heap max size (add '-Xmx<size>[g|G|m|M|k|K]' to
JVM options)
[13:25:05]   ^-- Set max direct memory size if getting 'OOME: Direct buffer
memory' (add '-XX:MaxDirectMemorySize=<size>[g|G|m|M|k|K]' to JVM options)
[13:25:05]   ^-- Disable processing of calls to System.gc() (add
'-XX:+DisableExplicitGC' to JVM options)
[13:25:05] Refer to this page for more performance suggestions:
https://apacheignite.readme.io/docs/jvm-and-system-tuning
[13:25:05] 
[13:25:05] To start Console Management & Monitoring run
ignitevisorcmd.{sh|bat}
[13:25:05] 
[13:25:05] Ignite node started OK (id=ed98ec56)
[13:25:05] Topology snapshot [ver=1, servers=1, clients=0, CPUs=8,
offheap=1.6GB, heap=1.7GB]
[13:25:05]   ^-- Node [id=ED98EC56-39A8-478D-B4E4-3EBA6DB29840,
clusterState=ACTIVE]
[13:25:05] Data Regions Configured:
[13:25:05]   ^-- default [initSize=256.0 MiB, maxSize=1.6 GiB,
persistenceEnabled=false]

>>> Cache query DDL example started.

>>> Created database objects.

>>> Populated data.

>>> Query results:
>>>    [Mary Major, Forest Hill]
>>>    [Jane Roe, Denver]
>>>    [John Doe, St. Petersburg]
>>>    [Richard Miles, Denver]
[13:25:10] Ignite node stopped OK [uptime=00:00:04.244]
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.ignite.spark.IgniteContext$.apply$default$3()Z
	at
org.apache.ignite.spark.impl.IgniteRelationProvider.igniteContext(IgniteRelationProvider.scala:236)
	at
org.apache.ignite.spark.impl.IgniteRelationProvider.createRelation(IgniteRelationProvider.scala:62)
	at
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
	at
org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
	at
org.apache.ignite.examples.sql.IgniteSparkSession.main(IgniteSparkSession.java:96)


This is my code:

package org.apache.ignite.examples.sql;

import java.util.List;
import java.util.Scanner;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.spark.IgniteDataFrameSettings;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class IgniteSparkSession {
    /** Dummy cache name. */
    private static final String DUMMY_CACHE_NAME = "dummy_cache";

    /** Ignite configuration file. */
    private static final String CONFIG = "examples/config/example-ignite.xml";

    /**
     * Executes example.
     *
     * @param args Command line arguments, none required.
     * @throws Exception If example execution failed.
     */
    @SuppressWarnings({"unused", "ThrowFromFinallyBlock"})
    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
            print("Cache query DDL example started.");

            // Create dummy cache to act as an entry point for SQL queries (a new SQL API which does not
            // require this will appear in future versions; JDBC and ODBC drivers do not require it already).
            CacheConfiguration<?, ?> cacheCfg = new CacheConfiguration<>(DUMMY_CACHE_NAME).setSqlSchema("PUBLIC");

            try (IgniteCache<?, ?> cache = ignite.getOrCreateCache(cacheCfg)) {
                // Create reference City table based on REPLICATED template.
                cache.query(new SqlFieldsQuery(
                    "CREATE TABLE city (id LONG PRIMARY KEY, name VARCHAR) WITH \"template=replicated\"")).getAll();

                // Create table based on PARTITIONED template with one backup.
                cache.query(new SqlFieldsQuery(
                    "CREATE TABLE person (id LONG, name VARCHAR, city_id LONG, PRIMARY KEY (id, city_id)) " +
                        "WITH \"backups=1, affinity_key=city_id\"")).getAll();

                // Create an index.
                cache.query(new SqlFieldsQuery("CREATE INDEX on Person (city_id)")).getAll();

                print("Created database objects.");

                SqlFieldsQuery qry = new SqlFieldsQuery("INSERT INTO city (id, name) VALUES (?, ?)");

                cache.query(qry.setArgs(1L, "Forest Hill")).getAll();
                cache.query(qry.setArgs(2L, "Denver")).getAll();
                cache.query(qry.setArgs(3L, "St. Petersburg")).getAll();

                qry = new SqlFieldsQuery("INSERT INTO person (id, name, city_id) VALUES (?, ?, ?)");

                cache.query(qry.setArgs(1L, "John Doe", 3L)).getAll();
                cache.query(qry.setArgs(2L, "Jane Roe", 2L)).getAll();
                cache.query(qry.setArgs(3L, "Mary Major", 1L)).getAll();
                cache.query(qry.setArgs(4L, "Richard Miles", 2L)).getAll();

                print("Populated data.");

                List<List<?>> res = cache.query(new SqlFieldsQuery(
                    "SELECT p.name, c.name FROM Person p INNER JOIN City c ON c.id = p.city_id")).getAll();

                print("Query results:");

                for (Object next : res)
                    System.out.println(">>>    " + next);

                // Create Spark session.
                SparkSession spark = SparkSession.builder()
                    .appName("Example Program")
                    .master("local")
                    .config("spark.executor.instances", "2")
                    .getOrCreate();

                // Spark reads data from Ignite.
                Dataset<Row> df = spark.read()
                    .format(IgniteDataFrameSettings.FORMAT_IGNITE())              // Data source.
                    .option(IgniteDataFrameSettings.OPTION_TABLE(), "person")     // Table to read.
                    .option(IgniteDataFrameSettings.OPTION_CONFIG_FILE(), CONFIG) // Ignite config.
                    .load();

                df.createOrReplaceTempView("person");

                Dataset<Row> igniteDF = spark.sql("SELECT * FROM person WHERE name = 'Mary Major'");

                igniteDF.show();

                cache.query(new SqlFieldsQuery("DROP TABLE Person")).getAll();
                cache.query(new SqlFieldsQuery("DROP TABLE City")).getAll();

                print("Dropped database objects.");
            }
            finally {
                // Distributed cache can be removed from cluster only by #destroyCache() call.
                //ignite.destroyCache(DUMMY_CACHE_NAME);
            }

            print("Cache query DDL example finished.");

            Scanner readUserInput = new Scanner(System.in);
            String myName = readUserInput.nextLine();

            ignite.close();

            System.out.println("Finished!");
        }
    }

    /**
     * Prints message.
     *
     * @param msg Message to print before all objects are printed.
     */
    private static void print(String msg) {
        System.out.println();
        System.out.println(">>> " + msg);
    }
}


