Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/04/16 18:59:00 UTC
[jira] [Resolved] (SPARK-38660) PySpark DeprecationWarning: distutils Version classes are deprecated
[ https://issues.apache.org/jira/browse/SPARK-38660?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean R. Owen resolved SPARK-38660.
----------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed
Issue resolved by pull request 35977
[https://github.com/apache/spark/pull/35977]
> PySpark DeprecationWarning: distutils Version classes are deprecated
> --------------------------------------------------------------------
>
> Key: SPARK-38660
> URL: https://issues.apache.org/jira/browse/SPARK-38660
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 3.2.1
> Reporter: Gergely Kalmar
> Priority: Major
> Fix For: 3.4.0
>
>
> When executing spark.read.csv(f'{gcs_bucket}/{data_file}', inferSchema=True, header=True) I'm getting the following warning:
> {noformat}
> .../lib/python3.8/site-packages/pyspark/sql/pandas/conversion.py:62: in toPandas
> require_minimum_pandas_version()
> .../lib/python3.8/site-packages/pyspark/sql/pandas/utils.py:35: in require_minimum_pandas_version
> if LooseVersion(pandas.__version__) < LooseVersion(minimum_pandas_version):
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> self = <[AttributeError("'LooseVersion' object has no attribute 'vstring'") raised in repr()] LooseVersion object at 0x7f2319fc0f70>, vstring = '1.4.1'
> def __init__ (self, vstring=None):
> > warnings.warn(
> "distutils Version classes are deprecated. "
> "Use packaging.version instead.",
> DeprecationWarning,
> stacklevel=2,
> )
> E DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
> .../lib/python3.8/site-packages/setuptools/_distutils/version.py:53: DeprecationWarning
> {noformat}
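The warning originates from `require_minimum_pandas_version`, which compares pandas versions with the now-deprecated `distutils.version.LooseVersion`. The message itself points at `packaging.version` as the replacement. Below is a minimal sketch of what such a check could look like; the helper name mirrors the one in the traceback, but the body and the minimum version shown here are illustrative, not the actual change merged in the pull request:

```python
from packaging.version import Version  # replacement suggested by the warning


def require_minimum_pandas_version(minimum="1.0.5"):
    """Raise if the installed pandas is older than `minimum` (illustrative sketch).

    Uses packaging.version.Version instead of distutils LooseVersion,
    so no DeprecationWarning is emitted on Python 3.10+/recent setuptools.
    """
    import pandas

    if Version(pandas.__version__) < Version(minimum):
        raise ImportError(
            f"Pandas >= {minimum} must be installed; found {pandas.__version__}."
        )


# Version objects compare release segments numerically, not lexicographically:
assert Version("1.4.1") >= Version("1.0.5")
assert Version("1.10.0") > Version("1.9.0")
```

Note that `packaging` is a third-party dependency (bundled with pip/setuptools in most environments), so a project avoiding new dependencies might instead vendor a small tuple-based comparison.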
--
This message was sent by Atlassian Jira
(v8.20.1#820001)