Posted to issues@spark.apache.org by "John Hany (Jira)" <ji...@apache.org> on 2021/03/19 16:13:00 UTC

[jira] [Created] (SPARK-34803) Util methods requiring certain versions of Pandas & PyArrow don't pass through the raised ImportError

John Hany created SPARK-34803:
---------------------------------

             Summary: Util methods requiring certain versions of Pandas & PyArrow don't pass through the raised ImportError
                 Key: SPARK-34803
                 URL: https://issues.apache.org/jira/browse/SPARK-34803
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.1.1
            Reporter: John Hany


When checking whether we can import either {{pandas}} or {{pyarrow}}, we catch any {{ImportError}} and raise a new error stating the minimum version of the respective package that must be available in the Python environment.

We don't, however, pass through the {{ImportError}} that might have been raised by the package itself. Take {{pandas}} as an example: when we call {{import pandas}}, pandas itself might be present in the environment, yet still raise an {{ImportError}} ([https://github.com/pandas-dev/pandas/blob/0.24.x/pandas/compat/__init__.py#L438]) if one of its own dependencies is missing. Because that error isn't propagated, we end up with a misleading message claiming that {{pandas}} isn't in the environment, when in fact it is installed but something else prevents us from importing it.
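
For illustration, this is roughly the pattern such a check follows today (a simplified sketch, not the exact PySpark source; the real helper lives in {{pyspark.sql.pandas.utils}} and also verifies the version number):

{code:python}
# Simplified sketch of the current pattern; version string is illustrative.
def require_minimum_pandas_version():
    minimum_pandas_version = "0.23.2"
    try:
        import pandas
        have_pandas = True
    except ImportError:
        have_pandas = False
    if not have_pandas:
        # The caught ImportError is discarded here, so the caller only ever
        # sees "not found", even when pandas is installed but fails to
        # import for an unrelated reason.
        raise ImportError("Pandas >= %s must be installed; however, "
                          "it was not found." % minimum_pandas_version)
{code}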

I believe this can be improved by chaining the exceptions, as sketched below, and I'm happy to contribute that fix.
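A minimal sketch of the change I have in mind, again using the {{pandas}} check as an example; the key point is {{raise ... from ...}}, which preserves the original {{ImportError}} as the {{\_\_cause\_\_}} of the new one:

{code:python}
# Sketch of the proposed change: keep the caught exception and chain it,
# so the original ImportError stays visible in the traceback.
def require_minimum_pandas_version():
    minimum_pandas_version = "0.23.2"  # illustrative; actual minimum may differ
    try:
        import pandas
        have_pandas = True
        raised_error = None
    except ImportError as error:
        have_pandas = False
        raised_error = error
    if not have_pandas:
        raise ImportError(
            "Pandas >= %s must be installed; however, "
            "it was not found." % minimum_pandas_version
        ) from raised_error
{code}

With the chained exception, a user whose pandas installation is broken (rather than missing) would see the underlying cause directly in the traceback instead of only the misleading "not found" message.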



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org