Posted to user@spark.apache.org by Du Li <li...@yahoo-inc.com> on 2014/07/10 02:58:03 UTC

Re: error in creating external table

Hi,

I got an error when trying to create an external table whose LOCATION points to a remote HDFS path.

I meant to quickly try out the basic JDBC features of Spark SQL 1.0, so I started the Thrift server in one terminal and the Beeline CLI in another. I didn't do any extra configuration of Spark SQL, Hive, or Hadoop, so the warehouse and metastore_db are created on the local file system by default. However, my setup of Shark (0.8.1 on Spark 0.8.1, Hive 0.9, and Hadoop 0.23.10) was able to create the same external table without any issue, and hadoop fs -ls was also able to list files under the same HDFS location.
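
For concreteness, the DDL was along these lines (the table name, columns, and HDFS path below are made up, only meant to show the shape of the statement):

    CREATE EXTERNAL TABLE events (
      id BIGINT,
      payload STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION 'hdfs://remote-nn.example.com:8020/data/events';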

The error message printed by the Thrift server is as follows.


Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Authorization (hadoop.security.authorization) is enabled but authentication (hadoop.security.authentication) is configured as simple. Please configure another method like kerberos or digest.) (state=,code=0)
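
For reference, the combination the exception complains about corresponds to these two Hadoop core-site.xml properties (hypothetical snippet; I have not set either of them explicitly anywhere):

    <property>
      <name>hadoop.security.authorization</name>
      <value>true</value>
    </property>
    <property>
      <name>hadoop.security.authentication</name>
      <value>simple</value>
    </property>

As far as I know the stock defaults are authentication=simple and authorization=false, so I'm not sure where the "authorization enabled" part is coming from in my untouched setup.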

Any suggestion would be appreciated.

Du