Posted to users@zeppelin.apache.org by Jayant Raj <ra...@gmail.com> on 2016/07/18 21:54:28 UTC

Change default interpreter to pyspark

Zeppelin has the Scala interpreter assigned as the default for Spark
notebooks. This default setting creates an additional step if you want to
write code using PySpark: you need to insert a %pyspark directive at the
beginning of each paragraph of the notebook so that Zeppelin knows it is
PySpark code.
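For example, every paragraph has to start like this before Zeppelin routes it to PySpark (a sketch of a notebook paragraph; `sc` is the SparkContext that Zeppelin injects into the interpreter session):

```
%pyspark
# Without the %pyspark directive above, this paragraph would be
# handed to the default Scala Spark interpreter instead.
print(sc.version)
```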

I modified the 'zeppelin.interpreters' property in zeppelin-site.xml and
restarted the Zeppelin process. However, I do not see the default
interpreter change. Am I missing something?
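For reference, this is roughly the change I made in zeppelin-site.xml, moving PySparkInterpreter to the front of the list (a sketch; the interpreter class names are the ones documented for the Spark interpreter group, so please verify them against your Zeppelin version):

```xml
<!-- zeppelin.interpreters is a comma-separated list of interpreter
     classes; the first entry is supposed to become the default. -->
<property>
  <name>zeppelin.interpreters</name>
  <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter,org.apache.zeppelin.spark.SparkSqlInterpreter</value>
  <description>Comma separated interpreter configurations. First interpreter becomes the default.</description>
</property>
```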

Thanks for any help.

Re: Change default interpreter to pyspark

Posted by Ahyoung Ryu <ah...@gmail.com>.
Hi Jayant,

I tested as you described and could not change the default interpreter
either. So I created an issue for this, ZEPPELIN-1209
<https://issues.apache.org/jira/browse/ZEPPELIN-1209>.
Thanks for reporting the issue.

Best regards,
Ahyoung
