Posted to issues@spark.apache.org by "Jacek Laskowski (JIRA)" <ji...@apache.org> on 2016/02/07 17:15:39 UTC
[jira] [Updated] (SPARK-13229) When checkpoint interval for ConstantInputDStream is lower than batch duration IllegalArgumentException says it is due to slide time instead
[ https://issues.apache.org/jira/browse/SPARK-13229?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jacek Laskowski updated SPARK-13229:
------------------------------------
Priority: Minor (was: Major)
Issue Type: Improvement (was: Bug)
> When checkpoint interval for ConstantInputDStream is lower than batch duration IllegalArgumentException says it is due to slide time instead
> --------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-13229
> URL: https://issues.apache.org/jira/browse/SPARK-13229
> Project: Spark
> Issue Type: Improvement
> Components: Streaming
> Affects Versions: 2.0.0
> Reporter: Jacek Laskowski
> Priority: Minor
>
> I have not set the slide time, so this requirement failure is not meaningful to me at all (and the end user is left confused):
> {code}
> java.lang.IllegalArgumentException: requirement failed: The checkpoint interval for ConstantInputDStream has been set to 1000 ms which is lower than its slide time (5000 ms). Please set it to at least 5000 ms.
> {code}
> Here is the code to reproduce it:
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.streaming.dstream.ConstantInputDStream
>
> val sc = new SparkContext("local[*]", "Constant Input DStream Demo", new SparkConf())
> val ssc = new StreamingContext(sc, batchDuration = Seconds(5))
> ssc.checkpoint("_checkpoint")
>
> val rdd = sc.parallelize(0 to 9)
> val cis = new ConstantInputDStream(ssc, rdd)
> cis.checkpoint(interval = Seconds(1))  // lower than batchDuration -- triggers the failure
> cis.print()
> ssc.start()
> {code}
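The confusion arises because an input DStream's slide duration defaults to the streaming context's batch duration, so the check compares the checkpoint interval against a "slide time" the user never set. A minimal sketch of that validation in plain Scala (no Spark dependency; the names `Duration` and `validateCheckpointInterval` are illustrative, not Spark's actual internals):

```scala
// Illustrative model of the checkpoint-interval check that produces the
// error above. For an input DStream, slideDuration == batchDuration,
// which is why the message talks about "slide time".
case class Duration(millis: Long) {
  def >=(that: Duration): Boolean = millis >= that.millis
  override def toString: String = s"$millis ms"
}

object CheckpointCheck {
  // Hypothetical helper mirroring the require(...) that fails in the report.
  def validateCheckpointInterval(streamName: String,
                                 checkpointInterval: Duration,
                                 slideDuration: Duration): Unit = {
    require(
      checkpointInterval >= slideDuration,
      s"The checkpoint interval for $streamName has been set to " +
        s"$checkpointInterval which is lower than its slide time " +
        s"($slideDuration). Please set it to at least $slideDuration."
    )
  }
}
```

With a 5000 ms batch (hence slide) duration and a 1000 ms checkpoint interval, the `require` fails with exactly the kind of message quoted above; mentioning that the slide time was derived from the batch duration would make the failure self-explanatory.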
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org