Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/06/28 17:30:57 UTC
[jira] [Resolved] (SPARK-16181) Incorrect behavior for isNull filter
[ https://issues.apache.org/jira/browse/SPARK-16181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yin Huai resolved SPARK-16181.
------------------------------
Resolution: Fixed
Assignee: Wenchen Fan
Fix Version/s: 2.0.0
This issue has been resolved by https://github.com/apache/spark/pull/13884.
> Incorrect behavior for isNull filter
> ------------------------------------
>
> Key: SPARK-16181
> URL: https://issues.apache.org/jira/browse/SPARK-16181
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Kevin Chen
> Assignee: Wenchen Fan
> Fix For: 2.0.0
>
>
> Repro:
>
> JavaRDD<Row> leftRdd = javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("x")));
> JavaRDD<Row> rightRdd = javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("y")));
> StructType schema = DataTypes.createStructType(ImmutableList.of(
>     DataTypes.createStructField("col", DataTypes.StringType, true)));
> Dataset<Row> left = sparkSession.createDataFrame(leftRdd, schema);
> Dataset<Row> right = sparkSession.createDataFrame(rightRdd, schema);
>
> // Add a column to the right side.
> Dataset<Row> withConstantColumn = right.withColumn("new", functions.lit(true));
>
> // Do a left outer join. Nothing matches, so joined should hold a single row ['x', null, null].
> Column joinCondition = left.col("col").equalTo(right.col("col"));
> Dataset<Row> joined = left.join(withConstantColumn, joinCondition, LeftOuter.toString());
>
> // Filter for nulls; we still expect the single row ['x', null, null].
> Dataset<Row> filtered = joined.filter(functions.col("new").isNull());
>
> // This fails with 1 != 0.
> Assert.assertEquals(1, filtered.count());
> [~rxin]
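For reference, the semantics the repro above expects can be sketched with plain Java collections, with no Spark involved. This is an illustrative model only (the class and method names here are made up, not Spark API): a left outer join of ["x"] against ["y"] produces one unmatched row whose right-hand side is null, so a filter analogous to col("new").isNull() must keep exactly that one row.

```java
import java.util.List;

public class LeftJoinNullFilterSketch {

    // Model a left outer join on string equality: each left value becomes a
    // row [leftValue, rightValueOrNull]; then count rows where the right side
    // is null, analogous to joined.filter(functions.col("new").isNull()).count().
    public static long countNullRight(List<String> left, List<String> right) {
        return left.stream()
                .map(l -> right.contains(l)
                        ? new String[]{l, l}       // matched: right side non-null
                        : new String[]{l, null})   // unmatched: right side null
                .filter(row -> row[1] == null)     // the isNull filter
                .count();
    }

    public static void main(String[] args) {
        // left=["x"], right=["y"]: nothing matches, so one null-padded row survives.
        System.out.println(countNullRight(List.of("x"), List.of("y"))); // prints 1
    }
}
```

Under these semantics the count is 1; the bug resolved above was Spark incorrectly returning 0 for the analogous query.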
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org