Posted to issues@spark.apache.org by "Sandeep Singh (Jira)" <ji...@apache.org> on 2023/01/03 02:38:00 UTC
[jira] [Created] (SPARK-41846) DataFrame aggregation functions: unresolved columns
Sandeep Singh created SPARK-41846:
-------------------------------------
Summary: DataFrame aggregation functions: unresolved columns
Key: SPARK-41846
URL: https://issues.apache.org/jira/browse/SPARK-41846
Project: Spark
Issue Type: Sub-task
Components: Connect
Affects Versions: 3.4.0
Reporter: Sandeep Singh
{code}
File "/.../spark/python/pyspark/sql/connect/column.py", line 106, in pyspark.sql.connect.column.Column.eqNullSafe
Failed example:
    df1.join(df2, df1["value"] == df2["value"]).count()
Exception raised:
    Traceback (most recent call last):
      File "/.../miniconda3/envs/python3.9/lib/python3.9/doctest.py", line 1336, in __run
        exec(compile(example.source, filename, "single",
      File "<doctest pyspark.sql.connect.column.Column.eqNullSafe[4]>", line 1, in <module>
        df1.join(df2, df1["value"] == df2["value"]).count()
      File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 151, in count
        pdd = self.agg(_invoke_function("count", lit(1))).toPandas()
      File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 1031, in toPandas
        return self._session.client.to_pandas(query)
      File "/.../spark/python/pyspark/sql/connect/client.py", line 413, in to_pandas
        return self._execute_and_fetch(req)
      File "/.../spark/python/pyspark/sql/connect/client.py", line 573, in _execute_and_fetch
        self._handle_error(rpc_error)
      File "/.../spark/python/pyspark/sql/connect/client.py", line 619, in _handle_error
        raise SparkConnectAnalysisException(
    pyspark.sql.connect.client.SparkConnectAnalysisException: [AMBIGUOUS_REFERENCE] Reference `value` is ambiguous, could be: [`value`, `value`].
{code}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org