Posted to issues@spark.apache.org by "Haejoon Lee (Jira)" <ji...@apache.org> on 2023/03/03 20:41:00 UTC
[jira] [Created] (SPARK-42666) Fix `tail` to work properly
Haejoon Lee created SPARK-42666:
-----------------------------------
Summary: Fix `tail` to work properly
Key: SPARK-42666
URL: https://issues.apache.org/jira/browse/SPARK-42666
Project: Spark
Issue Type: Sub-task
Components: Connect
Affects Versions: 3.5.0
Reporter: Haejoon Lee
The code below does not work properly in Spark Connect:
{code:java}
>>> sdf = spark.range(10)
>>> spark.createDataFrame(sdf.tail(5), sdf.schema)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 94, in __repr__
return "DataFrame[%s]" % (", ".join("%s: %s" % c for c in self.dtypes))
File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 162, in dtypes
return [(str(f.name), f.dataType.simpleString()) for f in self.schema.fields]
File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 1346, in schema
self._schema = self._session.client.schema(query)
File "/.../spark/python/pyspark/sql/connect/client.py", line 614, in schema
proto_schema = self._analyze(method="schema", plan=plan).schema
File "/.../spark/python/pyspark/sql/connect/client.py", line 755, in _analyze
self._handle_error(rpc_error)
File "/.../spark/python/pyspark/sql/connect/client.py", line 894, in _handle_error
raise convert_exception(info, status.message) from None
pyspark.errors.exceptions.connect.AnalysisException: [NULLABLE_COLUMN_OR_FIELD] Column or field `id` is nullable while it's required to be non-nullable.{code}
whereas it works properly in regular PySpark:
{code:java}
>>> sdf = spark.range(10)
>>> spark.createDataFrame(sdf.tail(5), sdf.schema).show()
+---+
| id|
+---+
| 5|
| 6|
| 7|
| 8|
| 9|
+---+ {code}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)