Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/16 16:25:00 UTC

[GitHub] [spark] gaborgsomogyi commented on a change in pull request #24382: [SPARK-27330][SS] support task abort in foreach writer

gaborgsomogyi commented on a change in pull request #24382: [SPARK-27330][SS] support task abort in foreach writer
URL: https://github.com/apache/spark/pull/24382#discussion_r275868759
 
 

 ##########
 File path: docs/structured-streaming-programming-guide.md
 ##########
 @@ -2240,6 +2248,8 @@ When the streaming query is started, Spark calls the function or the object’s
 
       - Method close(error) is called with error (if any) seen while processing rows.
 
+      - Method abort is called after closed with error is being called or after an unexpected error (such as on task interruption)
 
 Review comment:
  This sentence is weird; maybe: `Method abort is called either after close with error is called or...`
  One additional thing I would like to mention is missing: the case where `close` itself throws an exception. Mentioning only `closed with error` gives the impression that this is the only case in which it is called.
  
  The other option I see is to say that it is called on any exception during processing (except when `open` throws).
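
  To make the lifecycle under discussion concrete, here is a minimal, hypothetical sketch (not Spark code; the class and method names merely mimic the `ForeachWriter` API described in the guide) of a driver loop in which `abort` fires when `close` is invoked with an error, when `close` itself throws, or on any unexpected error after `open` succeeds:

  ```python
  # Hypothetical sketch of the writer lifecycle being reviewed.
  # ForeachWriterSketch and run_partition are illustrative names,
  # not part of any real Spark API.

  class ForeachWriterSketch:
      def open(self, partition_id, epoch_id):
          # Return True to accept this partition/epoch for writing.
          return True

      def process(self, row):
          pass  # write one row

      def close(self, error):
          pass  # error is None on success, else the exception seen

      def abort(self):
          pass  # cleanup hook for any failure path after open()

  def run_partition(writer, rows):
      """Drive one partition through the lifecycle: abort is called
      either after close(error) with a non-None error, or when close
      itself throws; it is never called if open declines or throws."""
      if not writer.open(0, 0):
          return  # open declined; nothing was started, nothing to abort
      error = None
      try:
          for row in rows:
              writer.process(row)
      except Exception as e:
          error = e
      try:
          writer.close(error)
      except Exception as e:
          error = e  # close threw: this also triggers abort
      if error is not None:
          writer.abort()
  ```

  Under this reading, documenting `abort` as "called on any exception during processing (except when `open` throws)" would cover both the close-with-error and close-throws cases in one sentence.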

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org