Posted to user@spark.apache.org by kmat <ku...@hotmail.com> on 2016/06/24 23:06:57 UTC

Spark 2.0 Continuous Processing

Is there a way to checkpoint sink(s) to facilitate rewinding processing to a
specific offset?
For example, consider a continuous query aggregating by month: in the 10th
month, I would like to re-compute the information for the 4th through 8th
months.
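Structured Streaming does not expose a user-facing rewind API for a running query; one common workaround (assuming a Kafka source with the spark-sql-kafka connector) is to start a fresh query with a new checkpoint directory and an explicit `startingOffsets` JSON that points at the offsets to replay from. A small sketch of building that JSON (the topic name and offset values below are made up for illustration):

```python
import json

def starting_offsets(topic, partition_offsets):
    """Build the JSON string accepted by the Kafka source's
    `startingOffsets` option. Per the connector's convention,
    -2 means "earliest" and -1 means "latest" for a partition."""
    return json.dumps({topic: {str(p): o for p, o in partition_offsets.items()}})

# Replay topic "events" from offset 1500 on partition 0 and 2300 on partition 1.
offsets = starting_offsets("events", {0: 1500, 1: 2300})
print(offsets)  # {"events": {"0": 1500, "1": 2300}}
```

The resulting string would then be passed as `.option("startingOffsets", offsets)` on the Kafka reader, together with a fresh `checkpointLocation`, so the replayed run does not conflict with the original query's offset log.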



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-2-0-Continuous-Processing-tp27226.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


RE: Spark 2.0 Continuous Processing

Posted by kmat <ku...@hotmail.com>.
In a continuous processing pipeline built on DataFrames, is there any way for the user to checkpoint the processing state at periodic intervals? The idea behind this is to rewind to any particular checkpoint and then fast-forward processing from there.
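Structured Streaming writes its own offset log under `checkpointLocation` rather than exposing user-triggered checkpoints, but the rewind-and-replay idea can be sketched outside Spark as a simple offset ledger (all names here are hypothetical, not a Spark API):

```python
class OffsetLedger:
    """Toy ledger of periodic offset snapshots: record progress at
    every `interval` boundary, and rewind to the snapshot at or
    before a requested position so processing can be replayed."""

    def __init__(self, interval):
        self.interval = interval
        self.snapshots = []  # committed offsets, oldest first

    def record(self, offset):
        # Take a snapshot only at period boundaries.
        if offset % self.interval == 0:
            self.snapshots.append(offset)

    def rewind_to(self, target):
        # Return the latest snapshot not past `target`; replay starts there.
        eligible = [s for s in self.snapshots if s <= target]
        return max(eligible) if eligible else 0

ledger = OffsetLedger(interval=100)
for off in range(0, 1001):
    ledger.record(off)
print(ledger.rewind_to(457))  # 400
```

Replaying from the returned snapshot re-processes at most one interval's worth of records, which is the usual trade-off between checkpoint frequency and replay cost.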
Date: Wed, 29 Jun 2016 23:17:47 -0700
From: ml-node+s1001560n27250h41@n3.nabble.com
To: kurianmathewp@hotmail.com
Subject: Re: Spark 2.0 Continuous Processing



	 It also supports interactive and batch queries. With Spark 2.0, DataFrames and Datasets are being combined, which ensures that an event or file is processed once, and only once.
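Exactly-once semantics at the sink generally rest on a replayable source plus idempotent writes, so that re-delivering a record after a failure has no additional effect. A minimal sketch of the idempotent-write half (hypothetical names, not a Spark API):

```python
class IdempotentSink:
    """Toy sink that ignores re-delivered records: each write is keyed
    by (batch_id, record_id), so replaying a batch after a failure
    does not duplicate its effects."""

    def __init__(self):
        self.seen = set()
        self.rows = []

    def write(self, batch_id, record_id, value):
        key = (batch_id, record_id)
        if key in self.seen:
            return False  # duplicate delivery, skip
        self.seen.add(key)
        self.rows.append(value)
        return True

sink = IdempotentSink()
sink.write(7, "a", 10)
sink.write(7, "a", 10)  # replay of the same record
print(len(sink.rows))  # 1
```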




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-2-0-Continuous-Processing-tp27226p27253.html