Posted to issues@spark.apache.org by "Michael Nazario (JIRA)" <ji...@apache.org> on 2015/06/02 16:55:17 UTC
[jira] [Commented] (SPARK-7899) PySpark sql/tests breaks pylint validation
[ https://issues.apache.org/jira/browse/SPARK-7899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14569208#comment-14569208 ]
Michael Nazario commented on SPARK-7899:
----------------------------------------
[~davies] Could we get this backported to Spark 1.4?
> PySpark sql/tests breaks pylint validation
> ------------------------------------------
>
> Key: SPARK-7899
> URL: https://issues.apache.org/jira/browse/SPARK-7899
> Project: Spark
> Issue Type: Bug
> Components: PySpark, Tests
> Affects Versions: 1.4.0
> Reporter: Michael Nazario
> Assignee: Michael Nazario
> Fix For: 1.5.0
>
>
> The pyspark.sql.types module is dynamically renamed from {{_types}} to {{types}}, which breaks pylint validation.
> From [~justin.uang] below:
> In commit 04e44b37 (the migration to Python 3), {{pyspark/sql/types.py}} was renamed to {{pyspark/sql/_types.py}}, and then some magic in {{pyspark/sql/__init__.py}} dynamically renamed the module back to {{types}}. I imagine that this is some naming conflict with Python 3, but what was the error that showed up?
> The reason I'm asking is that this is messing with pylint, since pylint can no longer statically find the module. I also tried importing the package in an init-hook so that {{__init__}} would be run, but that isn't what the discovery mechanism is using. I imagine it's probably just crawling the directory structure.
> One way to work around this would be something akin to this (http://stackoverflow.com/questions/9602811/how-to-tell-pylint-to-ignore-certain-imports), where I would have to create a fake module, but I would probably lose a ton of pylint features for users of that module, and it's pretty hacky.
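For readers unfamiliar with the trick being described: a module can be registered in {{sys.modules}} under a name that has no corresponding file on disk, which is exactly why a static tool like pylint cannot discover it by crawling the directory tree. The following is a minimal, hypothetical sketch of that mechanism, not the actual code in {{pyspark/sql/__init__.py}}; the module and attribute names ({{types_public}}, {{StringType}} as a bare stand-in class) are invented for illustration.

```python
import sys
import types as _pytypes  # stdlib helper for building module objects

# Stand-in for a module implemented in a "private" file (in PySpark this
# is pyspark/sql/_types.py; here we fabricate it in memory).
_mod = _pytypes.ModuleType("_types")
_mod.StringType = type("StringType", (), {})  # hypothetical attribute

# The renaming trick: register the same module object under the public
# name. There is no types_public.py anywhere on disk.
_mod.__name__ = "types_public"
sys.modules["types_public"] = _mod

# The import machinery consults sys.modules first, so this succeeds
# even though no file backs the name -- which is what defeats pylint's
# directory-based discovery.
import types_public
assert types_public.StringType is _mod.StringType
```

Because the binding exists only at runtime, any tool that resolves imports by scanning the filesystem (rather than executing {{__init__}}) will report the module as missing.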
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org