Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2017/06/01 23:25:04 UTC

[jira] [Updated] (SPARK-20894) Error while checkpointing to HDFS

     [ https://issues.apache.org/jira/browse/SPARK-20894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shixiong Zhu updated SPARK-20894:
---------------------------------
    Fix Version/s: 2.3.0

> Error while checkpointing to HDFS
> ---------------------------------
>
>                 Key: SPARK-20894
>                 URL: https://issues.apache.org/jira/browse/SPARK-20894
>             Project: Spark
>          Issue Type: Improvement
>          Components: Structured Streaming
>    Affects Versions: 2.1.1
>         Environment: Ubuntu, Spark 2.1.1, hadoop 2.7
>            Reporter: kant kodali
>            Assignee: Shixiong Zhu
>             Fix For: 2.3.0
>
>         Attachments: driver_info_log, executor1_log, executor2_log
>
>
> Dataset<Row> df2 = df1.groupBy(functions.window(df1.col("Timestamp5"), "24 hours", "24 hours"), df1.col("AppName")).count();
> StreamingQuery query = df2.writeStream().foreach(new KafkaSink()).option("checkpointLocation","/usr/local/hadoop/checkpoint").outputMode("update").start();
> query.awaitTermination();
> This fails with the following error:
> ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
> java.lang.IllegalStateException: Error reading delta file /usr/local/hadoop/checkpoint/state/0/0/1.delta of HDFSStateStoreProvider[id = (op=0, part=0), dir = /usr/local/hadoop/checkpoint/state/0/0]: /usr/local/hadoop/checkpoint/state/0/0/1.delta does not exist
> I cleared all checkpoint data in /usr/local/hadoop/checkpoint/ and all consumer offsets in Kafka on all brokers before running, yet this error still persists.
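> [Editorial note, not part of the original report: this symptom can occur when the checkpoint location is a bare local path, since each executor then resolves /usr/local/hadoop/checkpoint on its own local filesystem rather than on HDFS. A minimal configuration sketch, assuming an HDFS NameNode at a placeholder address (the host, port, and path below are illustrative, not from the report):

```java
// Sketch only: use a fully qualified HDFS URI for the checkpoint so the
// driver and all executors resolve the same shared directory.
// "namenode:8020" and the path are placeholder assumptions.
StreamingQuery query = df2.writeStream()
        .foreach(new KafkaSink())
        .option("checkpointLocation", "hdfs://namenode:8020/user/spark/checkpoint")
        .outputMode("update")
        .start();
```

> With an unqualified path, the filesystem actually used depends on fs.defaultFS in the Hadoop configuration visible to each JVM, which may differ between driver and executors.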



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org