Posted to issues@spark.apache.org by "Stavros Kontopoulos (JIRA)" <ji...@apache.org> on 2019/07/19 11:52:00 UTC
[jira] [Created] (SPARK-28445) Inconsistency between Scala and Python/Pandas UDFs when groupby udf() is used
Stavros Kontopoulos created SPARK-28445:
-------------------------------------------
Summary: Inconsistency between Scala and Python/Pandas UDFs when groupby udf() is used
Key: SPARK-28445
URL: https://issues.apache.org/jira/browse/SPARK-28445
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.0.0
Reporter: Stavros Kontopoulos
Python:
from pyspark.sql.functions import pandas_udf, PandasUDFType

@pandas_udf("int", PandasUDFType.SCALAR)
def noop(x):
    return x

spark.udf.register("udf", noop)
sql("""
CREATE OR REPLACE TEMPORARY VIEW testData AS SELECT * FROM VALUES
(1, 1), (1, 2), (2, 1), (2, 2), (3, 1), (3, 2), (null, 1), (3, null), (null, null)
AS testData(a, b)""")
sql("""SELECT udf(a + 1), udf(COUNT(b)) FROM testData GROUP BY udf(a + 1)""").show()
: org.apache.spark.sql.AnalysisException: expression 'testdata.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
Aggregate [udf((a#0 + 1))], [udf((a#0 + 1)) AS udf((a + 1))#10, udf(count(b#1)) AS udf(count(b))#12]
+- SubqueryAlias `testdata`
+- Project [a#0, b#1]
+- SubqueryAlias `testData`
+- LocalRelation [a#0, b#1]
Scala:
spark.udf.register("udf", (input: Int) => input)
sql("""
CREATE OR REPLACE TEMPORARY VIEW testData AS SELECT * FROM VALUES
(1, 1), (1, 2), (2, 1), (2, 2), (3, 1), (3, 2), (null, 1), (3, null), (null, null)
AS testData(a, b)""")
sql("""SELECT udf(a + 1), udf(COUNT(b)) FROM testData GROUP BY udf(a + 1)""").show()
+------------+-------------+
|udf((a + 1))|udf(count(b))|
+------------+-------------+
| null| 1|
| 3| 2|
| 4| 2|
| 2| 2|
+------------+-------------+
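Until the inconsistency is resolved, one workaround on the Python side is to materialize the UDF result in a subquery so that the grouping key and the projected expression refer to the same output column rather than re-invoking the UDF. This is a hedged sketch, assuming the same registered "udf" and testData view as above; the alias name "k" is illustrative:

```sql
-- Sketch of a workaround: compute udf(a + 1) once in a subquery,
-- then group by the resulting column instead of the UDF call.
SELECT k, udf(COUNT(b))
FROM (SELECT udf(a + 1) AS k, b FROM testData)
GROUP BY k
```

Because the analyzer then sees a plain column reference in both the projection and the GROUP BY clause, it should not need to match UDF invocations against grouping expressions.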
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)