Posted to user@spark.apache.org by Daniel Haviv <da...@veracity-group.com> on 2015/07/02 14:41:18 UTC

Starting Spark without automatically starting HiveContext

Hi,
I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I start
the spark-shell it always starts with a HiveContext.

How can I prevent the HiveContext from being initialized automatically?

Thanks,
Daniel

Re: Starting Spark without automatically starting HiveContext

Posted by ayan guha <gu...@gmail.com>.
HiveContext is a superset of SQLContext, so you should be able to perform
all your tasks with it. Are you facing any problem with HiveContext?
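To illustrate that point: because HiveContext extends SQLContext, the
sqlContext instance the shell creates (a HiveContext in the pre-built
binaries) can be used anywhere a plain SQLContext is expected. A minimal
spark-shell sketch against Spark 1.x (the JSON path is just an example
file from the Spark distribution):

// sqlContext is pre-created by spark-shell; in the pre-built binaries
// it is a HiveContext, which subclasses SQLContext.
import org.apache.spark.sql.SQLContext
val asPlain: SQLContext = sqlContext  // safe upcast

// Ordinary SQLContext operations work unchanged:
val df = asPlain.read.json("examples/src/main/resources/people.json")
df.registerTempTable("people")
asPlain.sql("SELECT * FROM people").show()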
On 3 Jul 2015 17:33, "Daniel Haviv" <da...@veracity-group.com> wrote:

> Thanks
> I was looking for a less hack-ish way :)
>
> Daniel
>
> On Fri, Jul 3, 2015 at 10:15 AM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> With the binary distribution I think it might not be possible, although
>> if you download the sources and build them yourself you can remove this
>> function
>> <https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1023>
>> which initializes the SQLContext.
>>
>> Thanks
>> Best Regards
>>
>> On Thu, Jul 2, 2015 at 6:11 PM, Daniel Haviv <
>> daniel.haviv@veracity-group.com> wrote:
>>
>>> Hi,
>>> I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I
>>> start the spark-shell it always starts with a HiveContext.
>>>
>>> How can I prevent the HiveContext from being initialized automatically?
>>>
>>> Thanks,
>>> Daniel
>>>
>>
>>
>

Re: Starting Spark without automatically starting HiveContext

Posted by Daniel Haviv <da...@veracity-group.com>.
The main reason is Spark's startup time and the need to configure a component I don't really need (without configs, the HiveContext takes more time to load).

Thanks,
Daniel
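
A workaround that avoids rebuilding Spark: leave the automatically
created context alone and construct a plain SQLContext by hand on top of
the shell's SparkContext. A minimal sketch for Spark 1.x (note this does
not stop the HiveContext from being initialized at startup; it only
sidesteps it afterwards):

// sc is the SparkContext that spark-shell creates for you.
import org.apache.spark.sql.SQLContext

// A plain SQLContext with no Hive configuration or metastore access.
val plainSqlContext = new SQLContext(sc)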

> On 3 July 2015, at 11:13, Robin East <ro...@xense.co.uk> wrote:
> 
> As Akhil mentioned, there isn't, AFAIK, any initialisation option to stop the SQLContext being created. If you could articulate why you need to do this (it's not obvious to me what the benefit would be), then maybe it could be included as a feature in a future release. It may also suggest a workaround.
> 
>> On 3 Jul 2015, at 08:33, Daniel Haviv <da...@veracity-group.com> wrote:
>> 
>> Thanks
>> I was looking for a less hack-ish way :)
>> 
>> Daniel
>> 
>>> On Fri, Jul 3, 2015 at 10:15 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>> With the binary distribution I think it might not be possible, although if you download the sources and build them yourself you can remove this function which initializes the SQLContext.
>>> 
>>> Thanks
>>> Best Regards
>>> 
>>>> On Thu, Jul 2, 2015 at 6:11 PM, Daniel Haviv <da...@veracity-group.com> wrote:
>>>> Hi,
>>>> I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I start the spark-shell it always starts with a HiveContext.
>>>> 
>>>> How can I prevent the HiveContext from being initialized automatically?
>>>> 
>>>> Thanks,
>>>> Daniel
> 

Re: Starting Spark without automatically starting HiveContext

Posted by Daniel Haviv <da...@veracity-group.com>.
Thanks
I was looking for a less hack-ish way :)

Daniel

On Fri, Jul 3, 2015 at 10:15 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> With the binary distribution I think it might not be possible, although
> if you download the sources and build them yourself you can remove this
> function
> <https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1023>
> which initializes the SQLContext.
>
> Thanks
> Best Regards
>
> On Thu, Jul 2, 2015 at 6:11 PM, Daniel Haviv <
> daniel.haviv@veracity-group.com> wrote:
>
>> Hi,
>> I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I
>> start the spark-shell it always starts with a HiveContext.
>>
>> How can I prevent the HiveContext from being initialized automatically?
>>
>> Thanks,
>> Daniel
>>
>
>

Re: Starting Spark without automatically starting HiveContext

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
With the binary distribution I think it might not be possible, although if
you download the sources and build them yourself you can remove this
function
<https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1023>
which initializes the SQLContext.
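
To make the suggestion concrete: the linked createSQLContext() in
SparkILoop.scala reflectively loads
org.apache.spark.sql.hive.HiveContext and falls back to a plain
SQLContext only when the Hive classes are missing. A rough sketch of the
kind of edit meant here, for a source build of the Spark 1.x REPL
(illustrative only; the exact method body differs between versions):

// SparkILoop.scala -- replace the HiveContext-loading body so the
// shell always builds a plain SQLContext and never touches Hive:
def createSQLContext(): SQLContext = {
  sqlContext = new SQLContext(sparkContext)
  echo("SQL context available as sqlContext.")
  sqlContext
}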

Thanks
Best Regards

On Thu, Jul 2, 2015 at 6:11 PM, Daniel Haviv <
daniel.haviv@veracity-group.com> wrote:

> Hi,
> I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I
> start the spark-shell it always starts with a HiveContext.
>
> How can I prevent the HiveContext from being initialized automatically?
>
> Thanks,
> Daniel
>