Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/06/24 04:55:16 UTC
[jira] [Assigned] (SPARK-16181) Incorrect behavior for isNull filter
[ https://issues.apache.org/jira/browse/SPARK-16181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-16181:
------------------------------------
Assignee: (was: Apache Spark)
> Incorrect behavior for isNull filter
> ------------------------------------
>
> Key: SPARK-16181
> URL: https://issues.apache.org/jira/browse/SPARK-16181
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Kevin Chen
>
> Repro:
>
>   JavaRDD<Row> leftRdd =
>       javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("x")));
>   JavaRDD<Row> rightRdd =
>       javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("y")));
>   StructType schema = DataTypes.createStructType(ImmutableList.of(
>       DataTypes.createStructField("col", DataTypes.StringType, true)));
>   Dataset<Row> left = sparkSession.createDataFrame(leftRdd, schema);
>   Dataset<Row> right = sparkSession.createDataFrame(rightRdd, schema);
>
>   // Add a constant column to the right side.
>   Dataset<Row> withConstantColumn = right.withColumn("new", functions.lit(true));
>
>   // Do a left outer join. Nothing matches, so we expect the joined Dataset
>   // to contain the single row ['x', null, null].
>   Column joinCondition = left.col("col").equalTo(right.col("col"));
>   Dataset<Row> joined = left.join(withConstantColumn, joinCondition, LeftOuter.toString());
>
>   // Filter for nulls; we still expect the single row ['x', null, null].
>   Dataset<Row> filtered = joined.filter(functions.col("new").isNull());
>
>   // This assertion fails with 1 != 0.
>   Assert.assertEquals(1, filtered.count());
> [~rxin]
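For reference, a self-contained version of the repro is sketched below. It is not part of the original report: the session/context bootstrap, the class name, and the use of the join-type string "left_outer" (in place of the Scala LeftOuter object, which is awkward to reference from Java) are assumptions made for illustration; the rest follows the quoted snippet.

import com.google.common.collect.ImmutableList;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class IsNullFilterRepro {  // hypothetical class name, for illustration
  public static void main(String[] args) {
    // Illustrative bootstrap; not part of the original report.
    SparkSession sparkSession = SparkSession.builder()
        .master("local[*]")
        .appName("SPARK-16181 repro")
        .getOrCreate();
    JavaSparkContext javaSparkContext =
        new JavaSparkContext(sparkSession.sparkContext());

    JavaRDD<Row> leftRdd =
        javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("x")));
    JavaRDD<Row> rightRdd =
        javaSparkContext.parallelize(ImmutableList.of(RowFactory.create("y")));
    StructType schema = DataTypes.createStructType(ImmutableList.of(
        DataTypes.createStructField("col", DataTypes.StringType, true)));
    Dataset<Row> left = sparkSession.createDataFrame(leftRdd, schema);
    Dataset<Row> right = sparkSession.createDataFrame(rightRdd, schema);

    // Add a constant column to the right side.
    Dataset<Row> withConstantColumn = right.withColumn("new", functions.lit(true));

    // Left outer join on a condition that matches nothing; the joined
    // Dataset should contain the single row ['x', null, null].
    Column joinCondition = left.col("col").equalTo(withConstantColumn.col("col"));
    Dataset<Row> joined = left.join(withConstantColumn, joinCondition, "left_outer");

    // Filtering for nulls on the constant column should keep that row;
    // on the affected version the count reportedly comes back as 0.
    Dataset<Row> filtered = joined.filter(functions.col("new").isNull());
    System.out.println("filtered.count() = " + filtered.count()); // expected: 1

    sparkSession.stop();
  }
}

The symptom reads as if the literal-backed column is being constant-folded through the outer join, so that isNull on it is optimized to false; that is an inference from the repro, not something stated in this thread.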
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)