Posted to issues@ignite.apache.org by "Andrey Aleksandrov (Jira)" <ji...@apache.org> on 2019/10/21 11:22:00 UTC
[jira] [Comment Edited] (IGNITE-11724) IgniteSpark integration forgets to
close the IgniteContext and stop the client node when an error occurs
during PairFunction logic
[ https://issues.apache.org/jira/browse/IGNITE-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16955967#comment-16955967 ]
Andrey Aleksandrov edited comment on IGNITE-11724 at 10/21/19 11:21 AM:
------------------------------------------------------------------------
[~nizhikov] Sorry, my example wasn't complete. {{IgniteContext#close}} doesn't help in this case. Please take a look at the updated example and logs.
After {{IllegalStateException("some error")}} is thrown, the current job is not stopped at all, and the Ignite node started inside Spark keeps running regardless of the {{IgniteContext#close}} call.
Possibly this can be fixed more cleanly, but the current fix helps with this case.
BR,
Andrei
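The fix under discussion is essentially a close-on-failure pattern: if the save loop throws, the context should be closed before the exception propagates, instead of only closing the streamer. Below is a minimal, self-contained sketch of that pattern; {{CleanupSketch}}, {{FakeContext}}, and {{saveWithCleanup}} are hypothetical stand-ins for illustration, not the real Ignite API:
{code:java}
public class CleanupSketch {
    // Hypothetical stand-in for IgniteContext: records whether close() ran.
    static class FakeContext {
        boolean closed = false;
        void close(boolean shutdownClientNode) { closed = true; }
    }

    // Mirrors the proposed saveValues fix: if the save loop throws,
    // close the context before re-throwing, instead of leaking the client node.
    static void saveWithCleanup(FakeContext ctx, Runnable saveLoop) {
        try {
            saveLoop.run();
        } catch (RuntimeException e) {
            ctx.close(true);   // stop the client node on failure
            throw e;           // let the Spark job still fail visibly
        }
    }

    public static void main(String[] args) {
        FakeContext ctx = new FakeContext();
        try {
            saveWithCleanup(ctx, () -> {
                throw new IllegalStateException("some error");
            });
        } catch (IllegalStateException expected) {
            // the job fails, but the client node is no longer left running
        }
        System.out.println("context closed: " + ctx.closed);  // prints "context closed: true"
    }
}
{code}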
was (Author: aealeksandrov):
[~nizhikov] Sorry, my example wasn't complete. {{IgniteContext#close}} doesn't help in this case. Please take a look at the updated example and logs.
After {{IllegalStateException("some error")}} is thrown, the current job is not stopped at all.
Possibly this can be fixed more cleanly, but the current fix helps with this case.
BR,
Andrei
> IgniteSpark integration forgets to close the IgniteContext and stop the client node when an error occurs during PairFunction logic
> ---------------------------------------------------------------------------------------------------------------------------------
>
> Key: IGNITE-11724
> URL: https://issues.apache.org/jira/browse/IGNITE-11724
> Project: Ignite
> Issue Type: Bug
> Components: spark
> Affects Versions: 2.8
> Reporter: Andrey Aleksandrov
> Assignee: Alexey Zinoviev
> Priority: Major
> Labels: await
> Fix For: 2.8
>
> Attachments: logs.txt
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> The following code can hang if the PairFunction logic throws an exception:
> {code:java}
> public class Example {
>     public static void main(String[] args) {
>         String configPath = "/home/andrei/BDP/big-data-accelerator/modules/gridgain-spark-loader-examples/config/client.xml";
>
>         IgniteSparkSession igniteSession = IgniteSparkSession.builder()
>             .appName("Spark Ignite catalog example")
>             .master("local")
>             .config("ignite.disableSparkSQLOptimization", true)
>             .igniteConfig(configPath)
>             .getOrCreate();
>
>         JavaSparkContext sparkCtx = new JavaSparkContext(igniteSession.sparkContext());
>
>         final JavaRDD<Row> records = sparkCtx.parallelize(Arrays.asList(
>             new GenericRow()
>         ));
>
>         JavaPairRDD<Integer, Integer> rdd_records = records.mapToPair(new PairFunction<Row, Integer, Integer>() {
>             @Override
>             public Tuple2<Integer, Integer> call(Row row) throws Exception {
>                 throw new IllegalStateException("some error");
>             }
>         });
>
>         JavaIgniteContext<Integer, Integer> igniteContext = new JavaIgniteContext<>(sparkCtx, configPath);
>         JavaIgniteRDD<Integer, Integer> igniteRdd = igniteContext.<Integer, Integer>fromCache("Person");
>
>         igniteRdd.savePairs(rdd_records);
>
>         igniteContext.close(true);
>     }
> }
> {code}
> It looks like the internal code below (the saveValues method) should also close the IgniteContext in case of an unexpected exception, not only the data streamer:
> {code:scala}
>         try {
>             it.foreach(value => {
>                 val key = affinityKeyFunc(value, node.orNull)
>                 streamer.addData(key, value)
>             })
>         }
>         finally {
>             streamer.close()
>         }
>     })
> }
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)