Posted to dev@spark.apache.org by Andrew Lee <al...@hotmail.com> on 2015/03/13 20:43:44 UTC

Spark ThriftServer encounter java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

When Kerberos is enabled, I get the following exception when starting the Spark ThriftServer (Spark 1.2.1, git commit b6eaf77d4332bfb0a698849b1f5f917d20d70e97, Hive 0.13.1, Apache Hadoop 2.4.1).
Command to start the Thrift server:
./start-thriftserver.sh --hiveconf hive.server2.thrift.port=20000 --hiveconf hive.server2.thrift.bind.host=$(hostname) --master yarn-client
Error message in spark.log:

2015-03-13 18:26:05,363 ERROR org.apache.hive.service.cli.thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(93)) - Error: 
java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]
        at org.apache.hive.service.auth.SaslQOP.fromString(SaslQOP.java:56)
        at org.apache.hive.service.auth.HiveAuthFactory.getSaslProperties(HiveAuthFactory.java:118)
        at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:133)
        at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
        at java.lang.Thread.run(Thread.java:744)

I'm wondering if this is the same problem described in HIVE-8154 and HIVE-7620, caused by an older Hive code base in the Spark ThriftServer?
Any insights are appreciated. Currently, I can't get the Spark ThriftServer to run against a Kerberos-enabled cluster (Apache Hadoop 2.4.1).
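
To make the failure mode concrete, below is a minimal, self-contained sketch of the kind of lookup that produces this message (an illustration only, not Hive's actual SaslQOP source). A null QOP string suggests the hive.server2.thrift.sasl.qop value never made it from hive-site.xml to this check.

import java.util.HashMap;
import java.util.Map;

// Sketch only: a SaslQOP-style string lookup that rejects unknown (or null) values.
public class SaslQopLookupSketch {
    private static final Map<String, String> ALLOWED = new HashMap<>();
    static {
        ALLOWED.put("auth", "auth");           // authentication only
        ALLOWED.put("auth-int", "auth-int");   // authentication + integrity protection
        ALLOWED.put("auth-conf", "auth-conf"); // authentication + confidentiality
    }

    static String fromString(String value) {
        String qop = ALLOWED.get(value == null ? null : value.toLowerCase());
        if (qop == null) {
            throw new IllegalArgumentException(
                "Unknown auth type: " + value + " Allowed values are: " + ALLOWED.keySet());
        }
        return qop;
    }

    public static void main(String[] args) {
        System.out.println(fromString("auth")); // succeeds
        fromString(null);                       // reproduces the exception above
    }
}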

My hive-site.xml in spark/conf looks like the following.

<property>
  <name>hive.semantic.analyzer.factory.impl</name>
  <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
</property>
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
</property>
<property>
  <name>hive.stats.autogather</name>
  <value>false</value>
</property>
<property>
  <name>hive.session.history.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/home/hive/log/${user.name}</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive/scratch/${user.name}</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://somehostname:9083</value>
</property>
<!-- HIVE SERVER 2 -->
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>***</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>***</value>
</property>
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
  <description>Sasl QOP value; one of 'auth', 'auth-int' and 'auth-conf'</description>
</property>
<property>
  <name>hive.server2.enable.impersonation</name>
  <description>Enable user impersonation for HiveServer2</description>
  <value>true</value>
</property>
<!-- HIVE METASTORE -->
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.cache.pinobjtypes</name>
  <value>Table,Database,Type,FieldSchema,Order</value>
</property>
<property>
  <name>hdfs_sentinel_file</name>
  <value>***</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/hive</value>
</property>
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value>
</property>
<property>
  <name>hive.warehouse.subdir.inherit.perms</name>
  <value>true</value>
</property>
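
For reference, here is my understanding of how the hive.server2.thrift.sasl.qop value above is supposed to reach the SASL layer. This is a rough sketch under that assumption, not the actual HiveAuthFactory.getSaslProperties code:

import java.util.HashMap;
import java.util.Map;
import javax.security.sasl.Sasl;

// Assumption/sketch: the configured QOP should end up in the SASL properties
// used by the Thrift transport. If the config lookup returned null instead of
// "auth", a SaslQOP-style check like the one above would fail with
// "Unknown auth type: null".
public class SaslPropsSketch {
    static Map<String, String> saslProperties(Map<String, String> hiveSite) {
        String qop = hiveSite.get("hive.server2.thrift.sasl.qop");
        if (qop == null) {
            qop = "auth"; // the documented default
        }
        Map<String, String> props = new HashMap<>();
        props.put(Sasl.QOP, qop);            // "javax.security.sasl.qop"
        props.put(Sasl.SERVER_AUTH, "true"); // server must authenticate to the client
        return props;
    }

    public static void main(String[] args) {
        Map<String, String> hiveSite = new HashMap<>();
        hiveSite.put("hive.server2.thrift.sasl.qop", "auth");
        System.out.println(saslProperties(hiveSite));
    }
}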

Re: Spark ThriftServer encounter java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

Posted by Cheng Lian <li...@gmail.com>.
Yeah, SQL is the right component. Thanks!

Cheng

On 4/14/15 12:47 AM, Andrew Lee wrote:
> Hi Cheng,
>
> I couldn't find the component for the Spark ThriftServer; would that be the
> 'SQL' component?
>
> JIRA created.
> https://issues.apache.org/jira/browse/SPARK-6882


RE: Spark ThriftServer encounter java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

Posted by Andrew Lee <al...@hotmail.com>.
Hi Cheng,
I couldn't find the component for the Spark ThriftServer; would that be the 'SQL' component?
JIRA created: https://issues.apache.org/jira/browse/SPARK-6882


Re: Spark ThriftServer encounter java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

Posted by gogototo <wa...@gmail.com>.
I think this is the Hive 0.13.1 issue that was fixed in Hive 0.14:
https://issues.apache.org/jira/browse/HIVE-6741
Could you please release an artifact of org.spark-project.hive based on Hive 0.14 or later?
Thanks very much!





Re: Spark ThriftServer encounter java.lang.IllegalArgumentException: Unknown auth type: null Allowed values are: [auth-int, auth-conf, auth]

Posted by Cheng Lian <li...@gmail.com>.
Hey Andrew,

Would you please create a JIRA ticket for this? To preserve 
compatibility with existing Hive JDBC/ODBC drivers, Spark SQL's 
HiveThriftServer intercepts some HiveServer2 components and injects 
Spark-specific implementations into them. This makes the implementation 
somewhat hacky (e.g. a bunch of reflection tricks are used). We haven't 
included Kerberos tests in Spark's unit/integration test suites, and it's 
possible that HiveThriftServer2 somehow breaks Hive's Kerberos support.
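
To give a flavor of the reflection tricks involved, here is a small sketch 
of the pattern with hypothetical class and field names (not the real 
HiveServer2 or HiveThriftServer2 internals):

import java.lang.reflect.Field;

// Sketch of reflection-based injection: swap a private field of a "stock"
// server object for a Spark-backed implementation at runtime. All names here
// are hypothetical stand-ins.
public class ReflectionInjectionSketch {

    static class CLIServiceLike {
        String describe() { return "stock CLI service"; }
    }

    static class SparkCLIServiceLike extends CLIServiceLike {
        @Override String describe() { return "Spark-backed CLI service"; }
    }

    static class HiveServer2Like {
        private CLIServiceLike cliService = new CLIServiceLike();
        String whoHandlesQueries() { return cliService.describe(); }
    }

    public static void main(String[] args) throws Exception {
        HiveServer2Like server = new HiveServer2Like();

        // Replace the private field's value through reflection.
        Field f = HiveServer2Like.class.getDeclaredField("cliService");
        f.setAccessible(true);
        f.set(server, new SparkCLIServiceLike());

        System.out.println(server.whoHandlesQueries()); // "Spark-backed CLI service"
    }
}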

Cheng


