Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/07/12 16:22:00 UTC
[jira] [Assigned] (SPARK-28199) Move Trigger implementations to Triggers.scala and avoid exposing these to the end users
[ https://issues.apache.org/jira/browse/SPARK-28199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen reassigned SPARK-28199:
---------------------------------
Docs Text: In Spark 3.0, the deprecated class org.apache.spark.sql.streaming.ProcessingTime has been removed. Use org.apache.spark.sql.streaming.Trigger.ProcessingTime() instead. Likewise, org.apache.spark.sql.execution.streaming.continuous.ContinuousTrigger has been removed in favor of Trigger.Continuous(), and org.apache.spark.sql.execution.streaming.OneTimeTrigger has been hidden in favor of Trigger.Once().
Assignee: Jungtaek Lim
> Move Trigger implementations to Triggers.scala and avoid exposing these to the end users
> ----------------------------------------------------------------------------------------
>
> Key: SPARK-28199
> URL: https://issues.apache.org/jira/browse/SPARK-28199
> Project: Spark
> Issue Type: Improvement
> Components: Structured Streaming
> Affects Versions: 3.0.0
> Reporter: Jungtaek Lim
> Assignee: Jungtaek Lim
> Priority: Minor
> Labels: release-notes
>
> Even though ProcessingTime has been deprecated since 2.2.0, it is still used in the Spark codebase, and the alternative Spark proposes actually uses deprecated methods, which feels circular - we would never be able to remove the usage.
> In fact, ProcessingTime was deprecated because we want to expose only Trigger.xxx instead of the actual implementations, and I think we missed some other implementations as well.
> This issue aims to move all Trigger implementations to Triggers.scala and hide them from end users.
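The pattern the issue describes - keeping concrete trigger implementations out of the public API and exposing only factory methods - can be sketched as follows. This is a minimal self-contained illustration with hypothetical names, not the actual Spark source:

```scala
// Sketch of hiding implementations behind factory methods, as SPARK-28199
// proposes for Spark's triggers. All names here are illustrative.
sealed trait Trigger

object Trigger {
  // Concrete implementations are private to this object, so end users
  // can only obtain a Trigger through the public factory methods below.
  private case class ProcessingTimeTrigger(intervalMs: Long) extends Trigger
  private case object OneTimeTrigger extends Trigger
  private case class ContinuousTrigger(intervalMs: Long) extends Trigger

  // Public factories return the abstract Trigger type, never the
  // implementation classes, so those classes can be moved or renamed freely.
  def ProcessingTime(intervalMs: Long): Trigger = ProcessingTimeTrigger(intervalMs)
  def Once(): Trigger = OneTimeTrigger
  def Continuous(intervalMs: Long): Trigger = ContinuousTrigger(intervalMs)
}
```

Because callers only ever see the `Trigger` type, the implementation classes carry no compatibility burden and the deprecated public classes can eventually be removed, breaking the circular dependency described above.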
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org