Posted to commits@spark.apache.org by gu...@apache.org on 2021/07/16 12:42:33 UTC
[spark] branch branch-3.2 updated: [SPARK-36160][PYTHON][DOCS] Clarifying documentation for pyspark sql/column
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.2 by this push:
new d69861a [SPARK-36160][PYTHON][DOCS] Clarifying documentation for pyspark sql/column
d69861a is described below
commit d69861ab6fd25d6a2407d925c15328d851ebc05f
Author: Dominik Gehl <do...@open.ch>
AuthorDate: Fri Jul 16 21:32:53 2021 +0900
[SPARK-36160][PYTHON][DOCS] Clarifying documentation for pyspark sql/column
What changes were proposed in this pull request?
Adapting documentation of `between`, `getField`, `dropFields` and `cast` to the corresponding Scala doc
Why are the changes needed?
Documentation clarity
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Only documentation change
Closes #33369 from dominikgehl/feature/SPARK-36160.
Authored-by: Dominik Gehl <do...@open.ch>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 2d8d7b4aae224ee8f8cd3cea10ab5ee509b5ed2b)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
python/pyspark/sql/column.py | 9 +++++----
1 file changed, 5 insertions(+), 4 deletions(-)
diff --git a/python/pyspark/sql/column.py b/python/pyspark/sql/column.py
index 391ee5e..9046e7f 100644
--- a/python/pyspark/sql/column.py
+++ b/python/pyspark/sql/column.py
@@ -329,7 +329,7 @@ class Column(object):
def getField(self, name):
"""
- An expression that gets a field by name in a StructField.
+ An expression that gets a field by name in a :class:`StructType`.
.. versionadded:: 1.3.0
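The clarified docstring says `getField` extracts a named field from a struct column (a `StructType`), e.g. `df.select(df.r.getField("b"))` in PySpark. As a minimal plain-Python sketch of that lookup semantics (a dict stands in for a struct row; `get_field` is a hypothetical helper, not the pyspark implementation):

```python
def get_field(struct_row, name):
    # Return the value of the named field, mirroring the documented
    # Column.getField behavior of selecting one field out of a struct.
    return struct_row[name]

row = {"a": 1, "b": "x"}
assert get_field(row, "b") == "x"
```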
@@ -394,6 +394,7 @@ class Column(object):
def dropFields(self, *fieldNames):
"""
An expression that drops fields in :class:`StructType` by name.
+ This is a no-op if schema doesn't contain field name(s).
.. versionadded:: 3.1.0
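The added sentence documents that `dropFields` silently ignores names absent from the schema. A minimal plain-Python sketch of that no-op semantics (dict in place of a struct; `drop_fields` is a hypothetical stand-in, not the pyspark code):

```python
def drop_fields(struct_row, *field_names):
    # Keep every field whose name was not requested for removal;
    # a name the "schema" doesn't contain is simply a no-op.
    return {k: v for k, v in struct_row.items() if k not in field_names}

row = {"a": 1, "b": 2}
assert drop_fields(row, "b") == {"a": 1}
assert drop_fields(row, "missing") == row  # no-op for unknown names
```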
@@ -757,7 +758,8 @@ class Column(object):
name = copy_func(alias, sinceversion=2.0, doc=":func:`name` is an alias for :func:`alias`.")
def cast(self, dataType):
- """ Convert the column into type ``dataType``.
+ """
+ Casts the column into type ``dataType``.
.. versionadded:: 1.3.0
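The reworded docstring describes `cast` as converting a column into a target `dataType` (in PySpark, e.g. `df.age.cast("string")`). A minimal plain-Python sketch of a per-value cast, assuming Python built-in types as the targets (illustration only):

```python
def cast_value(value, data_type):
    # Mirror the documented idea of Column.cast: convert each value
    # to the requested target type.
    return data_type(value)

assert cast_value("42", int) == 42
assert cast_value(3, float) == 3.0
```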
@@ -783,8 +785,7 @@ class Column(object):
def between(self, lowerBound, upperBound):
"""
- A boolean expression that is evaluated to true if the value of this
- expression is between the given columns.
+ True if the current column is between the lower bound and upper bound, inclusive.
.. versionadded:: 1.3.0
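The clarified docstring makes the inclusivity explicit: `between` is true when the value lies between the bounds, inclusive on both ends (in PySpark, e.g. `df.filter(df.age.between(2, 4))`). A minimal plain-Python sketch of that predicate (`between` here is a hypothetical helper, not the pyspark code):

```python
def between(value, lower_bound, upper_bound):
    # Inclusive on both ends, matching the clarified docstring.
    return lower_bound <= value <= upper_bound

assert between(1, 1, 5)       # lower bound is included
assert between(5, 1, 5)       # upper bound is included
assert not between(6, 1, 5)
```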
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org