Posted to user@spark.apache.org by tfrisk <tf...@gmail.com> on 2014/12/27 23:24:45 UTC

Problem with StreamingContext - getting SPARK-2243

Hi,

Doing:
   val ssc = new StreamingContext(conf, Seconds(1))

and getting:
   Only one SparkContext may be running in this JVM (see SPARK-2243). To
ignore this error, set spark.driver.allowMultipleContexts = true.
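The error message names a configuration flag. As a minimal sketch of setting it (the app name is a hypothetical placeholder, and note that the flag only suppresses the SPARK-2243 check; it does not shut down whatever context is already running, so it usually masks the real problem rather than fixing it):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch only: set the flag named in the error on the driver conf.
// This silences the "only one SparkContext" check but leaves the first
// context running in the same JVM.
val conf = new SparkConf()
  .setAppName("StreamingSketch") // hypothetical app name
  .set("spark.driver.allowMultipleContexts", "true")
val ssc = new StreamingContext(conf, Seconds(1))
```

As the replies note, reusing the already-running context is the safer route than setting this flag.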


But I don't think that I have another SparkContext running. Is there any way
I can check this or force-kill it? I've tried restarting the server out of
desperation, but I still get the same issue. I was not getting this earlier
today.

Any help much appreciated.

Thanks,

Thomas




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Problem-with-StreamingContext-getting-SPARK-2243-tp20869.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Problem with StreamingContext - getting SPARK-2243

Posted by Rishi Yadav <ri...@infoobjects.com>.
You can also access the SparkConf via sc.getConf in the Spark shell, though
for the StreamingContext you can refer to sc directly, as Akhil suggested.
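For example, in spark-shell (a sketch; `sc` is the shell's pre-built SparkContext, and the property keys shown are standard Spark settings):

```scala
// Inspect the running context's configuration without constructing
// anything new -- sc.getConf returns the SparkConf it was built with.
val conf = sc.getConf
println(conf.get("spark.app.name"))     // e.g. "Spark shell"
println(conf.getOption("spark.master")) // the master URL, if set
```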

On Sun, Dec 28, 2014 at 12:13 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:


Re: Problem with StreamingContext - getting SPARK-2243

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
In the shell you could do:

val ssc = new StreamingContext(sc, Seconds(1))

as sc is the SparkContext that the shell has already instantiated (note the
`new` keyword, which the StreamingContext constructor requires).
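Putting that together, a minimal shell sketch (the socket source, host, and port are hypothetical placeholders):

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Reuse the shell's existing SparkContext instead of building a second
// one from a SparkConf -- this is what avoids the SPARK-2243 error.
val ssc = new StreamingContext(sc, Seconds(1))

// Hypothetical source: count words arriving on a local socket.
val lines = ssc.socketTextStream("localhost", 9999)
lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).print()

ssc.start()
ssc.awaitTermination()
```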

Thanks
Best Regards

On Sun, Dec 28, 2014 at 6:55 AM, Thomas Frisk <tf...@gmail.com> wrote:


Re: Problem with StreamingContext - getting SPARK-2243

Posted by Thomas Frisk <tf...@gmail.com>.
Yes you are right - thanks for that :)

On 27 December 2014 at 23:18, Ilya Ganelin <il...@gmail.com> wrote:


Re: Problem with StreamingContext - getting SPARK-2243

Posted by Ilya Ganelin <il...@gmail.com>.
Are you trying to do this in the shell? The shell is instantiated with a
SparkContext named sc.

-Ilya Ganelin

On Sat, Dec 27, 2014 at 5:24 PM, tfrisk <tf...@gmail.com> wrote:
