Posted to user@spark.apache.org by Chandraprakash Bhagtani <cp...@gmail.com> on 2016/05/23 11:41:13 UTC

Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Hi,

My Spark job is failing with Kerberos errors while creating the Hive context
in yarn-cluster mode; the same job runs fine in yarn-client mode. My Spark
version is 1.6.1.

I am passing hive-site.xml through the --files option.
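
For reference, the submit command looks roughly like this (the class name,
jar, and hive-site.xml path below are placeholders, not the real ones from
my job):

    # placeholders: com.example.MyApp, my-app.jar, /etc/hive/conf/hive-site.xml
    spark-submit \
      --master yarn-cluster \
      --files /etc/hive/conf/hive-site.xml \
      --class com.example.MyApp \
      my-app.jar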

I searched online and found that the same issue was fixed by SPARK-6207,
which went into Spark 1.4, but I am running 1.6.1.

Am I missing any configuration here?


-- 
Thanks & Regards,
Chandra Prakash Bhagtani

Re: Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Posted by Chandraprakash Bhagtani <cp...@gmail.com>.
Thanks Doug,

I already have all four of the configs you mentioned in my hive-site.xml. Do
I need to create a hive-site.xml in Spark's conf directory (it is not there
by default in 1.6.1)? Please suggest.


On Mon, May 23, 2016 at 9:53 PM, Doug Balog <do...@dugos.com>
wrote:

> I have a custom hive-site.xml for Spark in Spark's conf directory.
> These properties are the minimal ones that you need for Spark, I believe:
>
> hive.metastore.kerberos.principal = copy from your hive-site.xml, e.g. "hive/_HOST@FOO.COM"
> hive.metastore.uris = copy from your hive-site.xml, e.g. thrift://ms1.foo.com:9083
> hive.metastore.sasl.enabled = true
> hive.security.authorization.enabled = false
>
> Cheers,
>
> Doug
>
>
>
> > On May 23, 2016, at 7:41 AM, Chandraprakash Bhagtani <
> cpbhagtani@gmail.com> wrote:
> >
> > Hi,
> >
> > My Spark job is failing with Kerberos errors while creating the Hive
> > context in yarn-cluster mode; the same job runs fine in yarn-client mode.
> > My Spark version is 1.6.1.
> >
> > I am passing hive-site.xml through the --files option.
> >
> > I searched online and found that the same issue was fixed by SPARK-6207,
> > which went into Spark 1.4, but I am running 1.6.1.
> >
> > Am I missing any configuration here?
> >
> >
> > --
> > Thanks & Regards,
> > Chandra Prakash Bhagtani
>
>


-- 
Thanks & Regards,
Chandra Prakash Bhagtani

Re: Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Posted by Doug Balog <do...@dugos.com>.
I have a custom hive-site.xml for Spark in Spark's conf directory.
These properties are the minimal ones that you need for Spark, I believe:

hive.metastore.kerberos.principal = copy from your hive-site.xml, e.g. "hive/_HOST@FOO.COM"
hive.metastore.uris = copy from your hive-site.xml, e.g. thrift://ms1.foo.com:9083
hive.metastore.sasl.enabled = true
hive.security.authorization.enabled = false
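
Spelled out as a hive-site.xml, those four properties would look roughly like
this (the principal and metastore URI are just the example values above; copy
the real ones from your cluster's hive-site.xml):

    <configuration>
      <!-- example values only; take the real principal and URI from your cluster -->
      <property>
        <name>hive.metastore.kerberos.principal</name>
        <value>hive/_HOST@FOO.COM</value>
      </property>
      <property>
        <name>hive.metastore.uris</name>
        <value>thrift://ms1.foo.com:9083</value>
      </property>
      <property>
        <name>hive.metastore.sasl.enabled</name>
        <value>true</value>
      </property>
      <property>
        <name>hive.security.authorization.enabled</name>
        <value>false</value>
      </property>
    </configuration>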

Cheers,

Doug



> On May 23, 2016, at 7:41 AM, Chandraprakash Bhagtani <cp...@gmail.com> wrote:
> 
> Hi,
> 
> My Spark job is failing with Kerberos errors while creating the Hive context in yarn-cluster mode; the same job runs fine in yarn-client mode. My Spark version is 1.6.1.
> 
> I am passing hive-site.xml through the --files option.
> 
> I searched online and found that the same issue was fixed by SPARK-6207, which went into Spark 1.4, but I am running 1.6.1.
> 
> Am I missing any configuration here?
> 
> 
> -- 
> Thanks & Regards,
> Chandra Prakash Bhagtani




Re: Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Posted by Chandraprakash Bhagtani <cp...@gmail.com>.
Thanks, it worked!!!

On Tue, May 24, 2016 at 1:14 AM, Marcelo Vanzin <va...@cloudera.com> wrote:

> On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani
> <cp...@gmail.com> wrote:
> > I am passing hive-site.xml through the --files option.
>
> You need hive-site.xml in Spark's classpath too. The easiest way is to
> copy or symlink hive-site.xml into Spark's conf directory.
>
> --
> Marcelo
>



-- 
Thanks & Regards,
Chandra Prakash Bhagtani

Re: Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani
<cp...@gmail.com> wrote:
> I am passing hive-site.xml through the --files option.

You need hive-site.xml in Spark's classpath too. The easiest way is to
copy or symlink hive-site.xml into Spark's conf directory.
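
For example, assuming Hive's client config lives in /etc/hive/conf and
Spark's conf directory is /etc/spark/conf (both paths are placeholders for
wherever your install keeps them):

    # symlink so Spark picks up the same file Hive uses
    ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml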

-- 
Marcelo



Re: Spark job is failing with kerberos error while creating hive context in yarn-cluster mode (through spark-submit)

Posted by Ted Yu <yu...@gmail.com>.
Can you describe the Kerberos errors in more detail?

Which release of YARN are you using?

Cheers

On Mon, May 23, 2016 at 4:41 AM, Chandraprakash Bhagtani <
cpbhagtani@gmail.com> wrote:

> Hi,
>
> My Spark job is failing with Kerberos errors while creating the Hive context
> in yarn-cluster mode; the same job runs fine in yarn-client mode. My Spark
> version is 1.6.1.
>
> I am passing hive-site.xml through the --files option.
>
> I searched online and found that the same issue was fixed by SPARK-6207,
> which went into Spark 1.4, but I am running 1.6.1.
>
> Am I missing any configuration here?
>
>
> --
> Thanks & Regards,
> Chandra Prakash Bhagtani
>