Posted to issues@spark.apache.org by "Dat Tran (JIRA)" <ji...@apache.org> on 2016/01/11 15:55:39 UTC
[jira] [Updated] (SPARK-12753) Import error during unit test while calling a function from reduceByKey()
[ https://issues.apache.org/jira/browse/SPARK-12753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dat Tran updated SPARK-12753:
-----------------------------
Attachment: map.py
> Import error during unit test while calling a function from reduceByKey()
> -------------------------------------------------------------------------
>
> Key: SPARK-12753
> URL: https://issues.apache.org/jira/browse/SPARK-12753
> Project: Spark
> Issue Type: Question
> Components: PySpark
> Affects Versions: 1.6.0
> Environment: El Capitan, Single cluster Hadoop, Python 3, Spark 1.6, Anaconda
> Reporter: Dat Tran
> Priority: Trivial
> Labels: pyspark, python3, unit-test
> Attachments: map.py
>
>
> The current directory structure for my test script is as follows:
> project/
>     script/
>         __init__.py
>         map.py
>     test/
>         __init__.py
>         test_map.py
> I have attached the map.py and test_map.py files to this issue.
> When I run nosetests from the test directory, the test fails with a "No module named script" import error.
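>
> Since the attachments are not reproduced inline, here is a minimal sketch of what the two files contain, reconstructed from the description below (the add helper, the Row-based test data, and the sc fixture are my assumptions, not verbatim copies of the attachments):
>
> # script/map.py -- hypothetical reconstruction
> def add(x, y):
>     return x + y
>
> def map_add(df):
>     # reduceByKey is passed a named, module-level function here
>     return df.map(lambda x: (x.key, x.value)).reduceByKey(add)
>
> # test/test_map.py -- hypothetical reconstruction
> from pyspark.sql import Row
> from script.map import map_add
>
> def test_map_add(sc):  # sc: a SparkContext supplied by the test setup (assumed)
>     rdd = sc.parallelize([Row(key="a", value=1), Row(key="a", value=2)])
>     assert map_add(rdd).collect() == [("a", 3)]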
> However, when I modify the map_add function in map.py, replacing the call to add inside reduceByKey with an inline lambda, like this:
> def map_add(df):
>     result = df.map(lambda x: (x.key, x.value)).reduceByKey(lambda x, y: x + y)
>     return result
> the test passes.
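>
> My current guess, which I have not been able to confirm, is that PySpark pickles a named, module-level function by reference (module plus function name), so the Python worker has to be able to import script.map in order to rebuild it, whereas a lambda is serialized by value and needs no such import. Illustrated on the two variants (assumed mechanism, not verified):
>
> # assumed: pickled by reference, so the worker must be able to "import script.map"
> df.map(lambda x: (x.key, x.value)).reduceByKey(add)
> # serialized by value; self-contained, no module import needed on the worker
> df.map(lambda x: (x.key, x.value)).reduceByKey(lambda x, y: x + y)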
> Also, when I run the original test_map.py from the project directory, the test passes.
> I cannot figure out why the test fails to find the script module when nosetests is run from within the test directory.
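>
> A workaround sketch I am trying (paths assumed from the layout above; whether local-mode workers inherit PYTHONPATH this way is my assumption, not verified): make project/ visible to the driver via sys.path and to the workers via PYTHONPATH, before the SparkContext is created:
>
> # top of test/test_map.py -- hypothetical workaround
> import os
> import sys
>
> # project/ is the parent of test/
> PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
>
> # driver side: makes "from script.map import map_add" resolvable
> sys.path.insert(0, PROJECT_ROOT)
>
> # worker side (assumption): set before the SparkContext starts, so the
> # Python worker processes can import script.map when unpickling
> os.environ["PYTHONPATH"] = PROJECT_ROOT + os.pathsep + os.environ.get("PYTHONPATH", "")
>
> from script.map import map_add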
> I have also attached the error log file. Any help would be much appreciated.