Posted to issues@spark.apache.org by "Maciej Szymkiewicz (Jira)" <ji...@apache.org> on 2021/10/12 10:14:00 UTC

[jira] [Created] (SPARK-36985) Future typing errors in pyspark.pandas

Maciej Szymkiewicz created SPARK-36985:
------------------------------------------

             Summary: Future typing errors in pyspark.pandas
                 Key: SPARK-36985
                 URL: https://issues.apache.org/jira/browse/SPARK-36985
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.3.0
            Reporter: Maciej Szymkiewicz


The following problems were detected on master with mypy 0.920:
{code:none}
$ git rev-parse HEAD             
36b3bbc0aa9f9c39677960cd93f32988c7d7aaca
$ mypy --version                 
mypy 0.920+dev.332b712df848cd242987864b38bd237364654532
$ mypy --config-file mypy.ini pyspark
pyspark/pandas/indexes/base.py:184: error: Incompatible types in assignment (expression has type "CategoricalIndex", variable has type "MultiIndex")  [assignment]
pyspark/pandas/indexes/base.py:188: error: Incompatible types in assignment (expression has type "Int64Index", variable has type "MultiIndex")  [assignment]
pyspark/pandas/indexes/base.py:192: error: Incompatible types in assignment (expression has type "Float64Index", variable has type "MultiIndex")  [assignment]
pyspark/pandas/indexes/base.py:197: error: Incompatible types in assignment (expression has type "DatetimeIndex", variable has type "MultiIndex")  [assignment]
pyspark/pandas/indexes/base.py:199: error: Incompatible types in assignment (expression has type "Index", variable has type "MultiIndex")  [assignment]
pyspark/pandas/indexes/base.py:201: error: "MultiIndex" has no attribute "_anchor"  [attr-defined]
{code}
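These errors follow a common mypy pattern: when a variable is first assigned one concrete {{Index}} subclass, mypy infers that subclass as the variable's type, so later reassigning a sibling subclass (e.g. {{Int64Index}} where {{MultiIndex}} was inferred) is flagged as {{[assignment]}}, and attributes missing on the inferred type are flagged as {{[attr-defined]}}. The sketch below is not pyspark code, just a minimal hypothetical reproduction of the error class and the usual fix (an explicit annotation with the common base type):

```python
# Minimal sketch (hypothetical classes, not pyspark.pandas code) of the
# mypy [assignment] error pattern and its typical fix.

class Index: ...
class MultiIndex(Index): ...
class Int64Index(Index): ...


def pick_unannotated(multi: bool) -> Index:
    idx = MultiIndex()       # mypy infers idx: MultiIndex
    if not multi:
        idx = Int64Index()   # mypy 0.920: Incompatible types in assignment
                             # (expression has type "Int64Index",
                             #  variable has type "MultiIndex")  [assignment]
    return idx


def pick_annotated(multi: bool) -> Index:
    idx: Index = MultiIndex()  # explicit base-type annotation
    if not multi:
        idx = Int64Index()     # OK: Int64Index is an Index
    return idx
```

At runtime both functions behave identically; only the explicit {{Index}} annotation tells mypy the variable is meant to hold any subclass, which is the usual way such branchy index-construction code (as in {{pyspark/pandas/indexes/base.py}}) is made to type-check.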



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org