Posted to users@zeppelin.apache.org by Ruslan Dautkhanov <da...@gmail.com> on 2016/11/30 00:49:05 UTC

0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

After the 0.6.2 -> 0.7 upgrade, pySpark is no longer the default Spark interpreter,
even though we have org.apache.zeppelin.spark.PySparkInterpreter
listed first in zeppelin.interpreters.

zeppelin.interpreters in zeppelin-site.xml:

<property>
  <name>zeppelin.interpreters</name>
  <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter
  ...
</property>



Any ideas how to fix this?


Thanks,
Ruslan

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Jeff,

Yep, that was it.

Thank you!



-- 
Ruslan Dautkhanov


Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Ruslan Dautkhanov <da...@gmail.com>.
I got a lucky jira number :-)

https://issues.apache.org/jira/browse/ZEPPELIN-1777

Thank you Jeff.



-- 
Ruslan Dautkhanov


Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Jeff Zhang <zj...@gmail.com>.
Hmm, I think so. Please file a ticket for it.




Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Hi Jeff,

When I made pySpark the default, it works as expected,
except in the Settings UI. See the screenshot below.

Notice it shows %spark twice:
the first is marked as default, the second is not.
It should have been %pyspark (default), %spark, ...
since I made pyspark the default.

Is this a new bug in 0.7?

[image: Inline image 1]


-- 
Ruslan Dautkhanov


Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Jeff Zhang <zj...@gmail.com>.
Hi Ruslan,

I missed another thing: you also need to delete the file conf/interpreter.json,
which stores the original settings. Otherwise the original settings are always
loaded.
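
So the whole sequence is roughly (an untested sketch; paths assume a standard
$ZEPPELIN_HOME layout, and zeppelin-daemon.sh is the stock control script):

cd $ZEPPELIN_HOME
# 1. put "defaultInterpreter": true on the PySparkInterpreter entry
#    of the bundled setting file
vi interpreter/spark/interpreter-setting.json
# 2. delete the cached settings so Zeppelin rebuilds them from that file;
#    keep a backup, since interpreter.json also stores changes made in the UI
cp conf/interpreter.json conf/interpreter.json.bak
rm conf/interpreter.json
# 3. restart Zeppelin
bin/zeppelin-daemon.sh restart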



Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Got it. Thanks Jeff.

I've downloaded
https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
and saved it to $ZEPPELIN_HOME/interpreter/spark/.
Then I moved "defaultInterpreter": true, from the section with
    "className": "org.apache.zeppelin.spark.SparkInterpreter",
to the section with
    "className": "org.apache.zeppelin.spark.PySparkInterpreter",

pySpark is still not the default.



-- 
Ruslan Dautkhanov


Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Jeff Zhang <zj...@gmail.com>.
No, you don't need to create that directory; it should be in
$ZEPPELIN_HOME/interpreter/spark.





Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Thank you Jeff.

Do I have to create the interpreter/spark directory under $ZEPPELIN_HOME/conf
or directly under $ZEPPELIN_HOME?
And is zeppelin.interpreters in zeppelin-site.xml deprecated in 0.7, then?

Thanks!



-- 
Ruslan Dautkhanov


Re: 0.7.0 zeppelin.interpreters change: can't make pyspark the default Spark interpreter

Posted by Jeff Zhang <zj...@gmail.com>.
The default interpreter is now defined in interpreter-setting.json.

You can update the following file to make pyspark the default
interpreter, and then copy it to the folder interpreter/spark:

https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
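
For example, roughly (a sketch; the URL below is just the raw form of the
link above, and interpreter/spark already exists in a standard install):

cd $ZEPPELIN_HOME
# fetch the setting file
curl -O https://raw.githubusercontent.com/apache/zeppelin/master/spark/src/main/resources/interpreter-setting.json
# edit it so "defaultInterpreter": true sits on the PySparkInterpreter entry, then:
cp interpreter-setting.json interpreter/spark/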



