Posted to dev@zeppelin.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2015/10/31 05:32:27 UTC
[jira] [Created] (ZEPPELIN-379) Pyspark needs to warn user when sqlContext is overwritten
Felix Cheung created ZEPPELIN-379:
-------------------------------------
Summary: Pyspark needs to warn user when sqlContext is overwritten
Key: ZEPPELIN-379
URL: https://issues.apache.org/jira/browse/ZEPPELIN-379
Project: Zeppelin
Issue Type: Bug
Components: Interpreters
Affects Versions: 0.6.0
Reporter: Felix Cheung
Priority: Minor
I think this can be a fairly big usability problem; we need to figure out how to warn users about it.
From: Matt Sochor
Reply-To: <us...@zeppelin.incubator.apache.org>
Date: Thursday, October 29, 2015 at 3:19 PM
To: <us...@zeppelin.incubator.apache.org>
Subject: Re: pyspark with jar
I actually *just* figured it out. Zeppelin has sqlContext "already created and exposed" (https://zeppelin.incubator.apache.org/docs/interpreter/spark.html).
So when I do "sqlContext = SQLContext(sc)" I overwrite sqlContext. Then Zeppelin cannot see this new sqlContext.
Anyway, for anyone out there hitting this problem: do NOT initialize sqlContext yourself, and it works fine.
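To make the failure mode concrete, here is a minimal Python sketch of the name-shadowing that the note above describes. It does not use real Spark; FakeSQLContext is a hypothetical stand-in for pyspark.sql.SQLContext, and the dict stands in for the interpreter's user namespace. The point is only that rebinding the injected name leaves Zeppelin holding a reference to the old context, so anything registered on the new one is invisible to it.

```python
class FakeSQLContext:
    """Hypothetical stand-in for pyspark.sql.SQLContext (illustration only)."""
    def __init__(self):
        self.tables = {}

    def register_table(self, name):
        self.tables[name] = True

# Zeppelin creates a context and injects it into the user's namespace,
# keeping its own reference for %sql queries etc.
zeppelin_context = FakeSQLContext()
user_namespace = {"sqlContext": zeppelin_context}

# The user note rebinds the name, as in: sqlContext = SQLContext(sc)
user_namespace["sqlContext"] = FakeSQLContext()
user_namespace["sqlContext"].register_table("my_table")

# Zeppelin still consults the original context, which never saw the table:
print("my_table" in zeppelin_context.tables)            # False
print("my_table" in user_namespace["sqlContext"].tables)  # True
```

This is why the fix is simply not to reassign sqlContext: the pre-created instance is the only one Zeppelin's other machinery knows about.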
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)