Posted to issues@spark.apache.org by "Matt Chapman (JIRA)" <ji...@apache.org> on 2014/12/27 01:50:13 UTC
[jira] [Created] (SPARK-4974) Circular dependency in pyspark/context.py causes import failure.
Matt Chapman created SPARK-4974:
-----------------------------------
Summary: Circular dependency in pyspark/context.py causes import failure.
Key: SPARK-4974
URL: https://issues.apache.org/jira/browse/SPARK-4974
Project: Spark
Issue Type: Bug
Components: PySpark
Environment: Python 2.7.8
Ubuntu 14.10
Reporter: Matt Chapman
Steps to reproduce:
1. Start a Python REPL from the 'python/' directory. (Reproduced with both the default Python REPL and IPython.)
2. Run this code:
```
>>> import sys
>>> sys.path.append('build/')
>>> import pyspark.sql
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pyspark/__init__.py", line 63, in <module>
from pyspark.context import SparkContext
File "pyspark/context.py", line 25, in <module>
from pyspark import accumulators
ImportError: cannot import name accumulators
```
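For context, the failure above is the classic circular-import pattern: pyspark/context.py does `from pyspark import accumulators` while the pyspark package itself is still mid-initialization, so the name is not yet bound. A minimal standalone sketch of the same pattern, using hypothetical module names (`context_mod`, `accumulators_mod` — not Spark's actual files):

```python
# Sketch of a circular import: each module tries a top-level
# "from X import name" on the other, so whichever is imported
# second sees a partially initialized module and fails.
import os
import sys
import tempfile

pkg_dir = tempfile.mkdtemp()

# context_mod imports a name from accumulators_mod at module level...
with open(os.path.join(pkg_dir, "context_mod.py"), "w") as f:
    f.write("from accumulators_mod import Accumulator\n"
            "class Context: pass\n")

# ...and accumulators_mod imports a name back from context_mod,
# which at that point has not finished executing.
with open(os.path.join(pkg_dir, "accumulators_mod.py"), "w") as f:
    f.write("from context_mod import Context\n"
            "class Accumulator: pass\n")

sys.path.insert(0, pkg_dir)
try:
    import context_mod  # triggers the cycle
    error = None
except ImportError as e:
    error = str(e)

print(error)  # e.g. "cannot import name 'Context' ..."
```

A common workaround for this pattern is to replace one of the `from X import name` statements with a plain `import X` (deferring attribute access), or to move the import inside the function that needs it.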
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org