Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2015/12/01 01:17:11 UTC
[jira] [Updated] (SPARK-12061) [SQL] Dataset API: Adding Persist for Map/filter with Lambda Functions
[ https://issues.apache.org/jira/browse/SPARK-12061?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li updated SPARK-12061:
----------------------------
Description:
So far, the existing caching mechanism does not work for Dataset operations that use map/filter with lambda functions. For example:
{code}
test("persist and then map/filter with lambda functions") {
  val f = (i: Int) => i + 1
  val ds = Seq(1, 2, 3).toDS()
  val mapped = ds.map(f)
  mapped.cache()

  val mapped2 = ds.map(f)
  assertCached(mapped2)  // fails: the rebuilt plan is not recognized as cached
}
{code}
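A likely explanation (my inference; the issue itself does not spell out the mechanism) is that each call to ds.map(f) wraps f in a fresh closure inside the logical plan, and Scala function values have only reference equality, so the plan built for mapped2 never compares equal to the cached plan for mapped. A minimal sketch of that closure-equality behavior in plain Scala, with no Spark dependency (the wrap helper below is hypothetical, purely for illustration):

{code}
// Sketch only: structurally identical lambdas are distinct objects,
// so any plan comparison based on function equality will miss.
object ClosureEquality {
  def main(args: Array[String]): Unit = {
    val f = (i: Int) => i + 1
    val g = (i: Int) => i + 1
    println(f == g)  // false: two distinct closure instances

    // Hypothetical stand-in for a plan node wrapping the user function:
    // even the same lambda yields a new wrapper instance per call.
    def wrap(h: Int => Int): Iterator[Int] => Iterator[Int] = _.map(h)
    println(wrap(f) == wrap(f))  // false
  }
}
{code}

If that is indeed the cause, reusing the cached reference directly (val mapped2 = mapped) avoids the cache miss until the plan-matching issue is fixed.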
was:
So far, the existing caching mechanism does not work for Dataset operations that use map/filter with lambda functions. For example:
test("persist and then map/filter with lambda functions") {
val f = (i: Int) => i + 1
val ds = Seq(1, 2, 3).toDS()
val mapped = ds.map(f)
mapped.cache()
val mapped2 = ds.map(f)
assertCached(mapped2)
}
> [SQL] Dataset API: Adding Persist for Map/filter with Lambda Functions
> ----------------------------------------------------------------------
>
> Key: SPARK-12061
> URL: https://issues.apache.org/jira/browse/SPARK-12061
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Xiao Li
>
> So far, the existing caching mechanism does not work for Dataset operations that use map/filter with lambda functions. For example:
> {code}
> test("persist and then map/filter with lambda functions") {
>   val f = (i: Int) => i + 1
>   val ds = Seq(1, 2, 3).toDS()
>   val mapped = ds.map(f)
>   mapped.cache()
>
>   val mapped2 = ds.map(f)
>   assertCached(mapped2)  // fails: the rebuilt plan is not recognized as cached
> }
> {code}