Posted to commits@spark.apache.org by gu...@apache.org on 2023/03/15 11:38:48 UTC
[spark] branch branch-3.4 updated: [MINOR][PYTHON] Change TypeVar to private symbols
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new fb729ad1221 [MINOR][PYTHON] Change TypeVar to private symbols
fb729ad1221 is described below
commit fb729ad12216a9ffa872b7b91ecb8ddf3c6b8229
Author: Maico Timmerman <ma...@gmail.com>
AuthorDate: Wed Mar 15 20:38:11 2023 +0900
[MINOR][PYTHON] Change TypeVar to private symbols
### What changes were proposed in this pull request?
I've converted the internal typing symbols in `__init__.py` to private names. Once these changes are agreed upon, I can expand this PR to the other `TypeVar`s this applies to.
### Why are the changes needed?
Editors consider all public symbols in a library importable. A common pattern is to use a shorthand for pyspark functions:
```python
import pyspark.sql.functions as F
F.col(...)
```
Since `pyspark.F` is a public symbol according to `__init__.py`, editors will suggest importing it, even though that is not a valid use of pyspark: `F` there is only a `TypeVar` used for internal type hints, not the `pyspark.sql.functions` module.
This change follows Pyright's [Typing Guidance for Python Libraries](https://github.com/microsoft/pyright/blob/main/docs/typed-libraries.md#generic-classes-and-functions).
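For illustration, the snippet below is a minimal sketch of the issue and of the underscore convention that guidance recommends; it mirrors the symbols touched in pyspark's `__init__.py` but is not part of the actual change:
```python
from typing import Callable, TypeVar

# Before: a public module-level TypeVar. Editors index it as an importable
# symbol, so typing `F.col(...)` can trigger the auto-import suggestion
# `from pyspark import F`, which brings in this TypeVar rather than
# `pyspark.sql.functions` and raises AttributeError at runtime.
F = TypeVar("F", bound=Callable)

# After: the leading underscore marks the TypeVar as private, so editors and
# type checkers treat it as an internal detail and stop suggesting it.
_F = TypeVar("_F", bound=Callable)
```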
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Verified manually with PyCharm's auto-import suggestions.
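A quick programmatic check could look like the following; this is a hypothetical sketch run against an environment with the patch applied, not how the change was actually verified:
```python
import pyspark

# The old public TypeVar should no longer be exposed at the package top level,
# while the private replacement used for internal type hints still exists.
assert not hasattr(pyspark, "F"), "F should not be a public pyspark symbol"
assert hasattr(pyspark, "_F")

print("pyspark no longer exposes its typing helper as a public symbol")
```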
Closes #40338 from MaicoTimmerman/master.
Authored-by: Maico Timmerman <ma...@gmail.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 6d3587ad5ba676b0c82a2c75ccd00c370b592563)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
python/pyspark/__init__.py | 17 ++++++++---------
1 file changed, 8 insertions(+), 9 deletions(-)
diff --git a/python/pyspark/__init__.py b/python/pyspark/__init__.py
index 80f25f39bad..b8bca7776dd 100644
--- a/python/pyspark/__init__.py
+++ b/python/pyspark/__init__.py
@@ -69,11 +69,10 @@ from pyspark.profiler import Profiler, BasicProfiler
from pyspark.version import __version__
from pyspark._globals import _NoValue # noqa: F401
-T = TypeVar("T")
-F = TypeVar("F", bound=Callable)
+_F = TypeVar("_F", bound=Callable)
-def since(version: Union[str, float]) -> Callable[[F], F]:
+def since(version: Union[str, float]) -> Callable[[_F], _F]:
"""
A decorator that annotates a function to append the version of Spark the function was added.
"""
@@ -81,7 +80,7 @@ def since(version: Union[str, float]) -> Callable[[F], F]:
indent_p = re.compile(r"\n( +)")
- def deco(f: F) -> F:
+ def deco(f: _F) -> _F:
assert f.__doc__ is not None
indents = indent_p.findall(f.__doc__)
@@ -93,11 +92,11 @@ def since(version: Union[str, float]) -> Callable[[F], F]:
def copy_func(
- f: F,
+ f: _F,
name: Optional[str] = None,
sinceversion: Optional[Union[str, float]] = None,
doc: Optional[str] = None,
-) -> F:
+) -> _F:
"""
Returns a function with same code, globals, defaults, closure, and
name (or provide a new name).
@@ -119,10 +118,10 @@ def copy_func(
fn.__doc__ = doc
if sinceversion is not None:
fn = since(sinceversion)(fn)
- return cast(F, fn)
+ return cast(_F, fn)
-def keyword_only(func: F) -> F:
+def keyword_only(func: _F) -> _F:
"""
A decorator that forces keyword arguments in the wrapped method
and saves actual input keyword arguments in `_input_kwargs`.
@@ -139,7 +138,7 @@ def keyword_only(func: F) -> F:
self._input_kwargs = kwargs
return func(self, **kwargs)
- return cast(F, wrapper)
+ return cast(_F, wrapper)
# To avoid circular dependencies
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org