Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2015/07/30 06:27:04 UTC

[jira] [Updated] (SPARK-9192) add initialization phase for nondeterministic expression in code generation

     [ https://issues.apache.org/jira/browse/SPARK-9192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-9192:
-------------------------------
    Summary: add initialization phase for nondeterministic expression in code generation  (was: add initialization phase for nondeterministic expression)

> add initialization phase for nondeterministic expression in code generation
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-9192
>                 URL: https://issues.apache.org/jira/browse/SPARK-9192
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>             Fix For: 1.5.0
>
>
> Currently, nondeterministic expressions are broken without an explicit initialization phase.
> Let me take `MonotonicallyIncreasingID` as an example. This expression needs mutable state to remember how many times it has been evaluated, so we use `@transient var count: Long` there. Because it is transient, `count` is reset to 0, and **only** to 0, when the expression is serialized and deserialized, since deserializing a transient variable yields its default value. There is *no way* to use a different initial value for `count` until we add an explicit initialization phase (see the sketch below).
> For now, no nondeterministic expression needs this feature, but we may add new ones in the future that require a different initial value for their mutable state.
> Another use case is local execution of a LocalRelation: there is no serialize/deserialize phase there, so we cannot reset mutable state that way.
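> As a rough illustration, here is a minimal, self-contained Scala sketch (the class names and the `initialize` hook are made up for illustration; they are not the actual Spark internals): with only a transient field, the state can never start from anything but the JVM default, whereas an explicit initialization phase decouples resetting the state from serialization.
> {code}
> // Current situation: the counter is only ever "reset" as a side effect
> // of serialization, and then only to the JVM default value (0L).
> class TransientCounter extends Serializable {
>   @transient private var count: Long = 0L
>   def eval(): Long = { val v = count; count += 1; v }
> }
>
> // Proposed shape: an explicit initialization phase, decoupled from
> // serialization, so any initial value works and state can be reset
> // even for local execution where nothing is serialized.
> class InitializedCounter extends Serializable {
>   private var count: Long = 0L
>   def initialize(start: Long): Unit = { count = start }  // hypothetical hook
>   def eval(): Long = { val v = count; count += 1; v }
> }
>
> object Demo extends App {
>   val e = new InitializedCounter
>   e.initialize(100L)             // start from an arbitrary value, not just 0
>   println((e.eval(), e.eval()))  // prints (100,101)
> }
> {code}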


