Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/04/06 01:04:00 UTC
[jira] [Commented] (SPARK-23878) unable to import col() or lit()
[ https://issues.apache.org/jira/browse/SPARK-23878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16427812#comment-16427812 ]
Hyukjin Kwon commented on SPARK-23878:
--------------------------------------
So, just to be clear, the functions work fine at runtime, but the IDE doesn't detect them since they are defined dynamically?
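The "defined dynamically" point is the crux of the issue. A minimal pure-Python sketch of the pattern (the names `_create_function` and the echoed return value are illustrative, not the actual PySpark implementation): functions such as `col` and `lit` are created in a loop and injected into the module namespace at import time, so a static analyzer never sees a plain `def col(...)` in the source.

```python
# Sketch of runtime function generation, as PySpark does for
# pyspark.sql.functions. Because the names only appear via
# globals()[...] assignment, PyDev's static import resolver
# reports them as unresolved even though they exist at runtime.

def _create_function(name):
    def _(arg):
        # Real PySpark would call into the JVM here; this stub
        # just echoes the call to show the generated function's shape.
        return f"{name}({arg!r})"
    _.__name__ = name
    return _

for _name in ["col", "lit"]:
    globals()[_name] = _create_function(_name)

print(col("age"))  # runs fine, even though the IDE flags `col`
```

Running this shows the mismatch directly: the script executes without error, while any static checker pointed at the file reports `col` as undefined.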
> unable to import col() or lit()
> -------------------------------
>
> Key: SPARK-23878
> URL: https://issues.apache.org/jira/browse/SPARK-23878
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.3.0
> Environment: eclipse 4.7.3
> pyDev 6.3.2
> pyspark==2.3.0
> Reporter: Andrew Davidson
> Priority: Major
>
> I have some code I am moving from a Jupyter notebook to separate Python modules. My notebook uses col() and lit() and works fine.
> When I try to work with the module files in my IDE, I get the following errors. I am also not able to run my unit tests.
> {color:#FF0000}Description Resource Path Location Type{color}
> {color:#FF0000}Unresolved import: lit load.py /adt_pyDevProj/src/automatedDataTranslation line 22 PyDev Problem{color}
> {color:#FF0000}Unresolved import: col load.py /adt_pyDevProj/src/automatedDataTranslation line 21 PyDev Problem{color}
> I suspect that when you run pyspark, it generates the col and lit functions at runtime?
> I found a description of the problem at [https://stackoverflow.com/questions/40163106/cannot-find-col-function-in-pyspark]. I do not understand how to make this work in my IDE. I am not running pyspark, just an editor.
> Is there some sort of workaround or replacement for these missing functions?
>
>
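One way to quiet a static analyzer in this situation is to bind the names with an explicit runtime lookup instead of a static `import` statement. The sketch below simulates a dynamically populated module with `types.ModuleType` (a hypothetical stand-in for `pyspark.sql.functions`, so it runs without Spark installed); the same `getattr` pattern applies to the real module.

```python
import types

# Hypothetical stand-in for pyspark.sql.functions: a module whose
# attributes are created at runtime rather than written in source.
functions = types.ModuleType("functions")
for _name in ("col", "lit"):
    setattr(functions, _name, (lambda n: lambda arg: f"{n}({arg!r})")(_name))

# Workaround: bind the names via getattr. The IDE sees an ordinary
# assignment (no unresolved import), and at runtime the attribute
# genuinely exists on the module.
col = getattr(functions, "col")
lit = getattr(functions, "lit")

print(col("age"), lit(1))
```

Against real PySpark the equivalent would be `import pyspark.sql.functions` followed by `col = getattr(pyspark.sql.functions, "col")`; the IDE warning is cosmetic either way, since the direct import also succeeds at runtime.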
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org