Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/07/14 17:19:20 UTC
[jira] [Resolved] (SPARK-16528) HiveClientImpl throws NPE when reading database from a custom metastore
[ https://issues.apache.org/jira/browse/SPARK-16528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-16528.
---------------------------------
Resolution: Fixed
Assignee: Jacek Lewandowski
Fix Version/s: 2.1.0
2.0.1
> HiveClientImpl throws NPE when reading database from a custom metastore
> -----------------------------------------------------------------------
>
> Key: SPARK-16528
> URL: https://issues.apache.org/jira/browse/SPARK-16528
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Jacek Lewandowski
> Assignee: Jacek Lewandowski
> Fix For: 2.0.1, 2.1.0
>
>
> In _HiveClientImpl_ there is a method to create database:
> {code}
> override def createDatabase(
>     database: CatalogDatabase,
>     ignoreIfExists: Boolean): Unit = withHiveState {
>   client.createDatabase(
>     new HiveDatabase(
>       database.name,
>       database.description,
>       database.locationUri,
>       database.properties.asJava),
>     ignoreIfExists)
> }
> {code}
> The problem is that it assumes {{database.properties}} is non-null, which is not always the case. In fact, when the default database is created in _HiveMetaStore_, we have:
> {code}
> private void createDefaultDB_core(RawStore ms) throws MetaException, InvalidObjectException {
>   try {
>     ms.getDatabase(DEFAULT_DATABASE_NAME);
>   } catch (NoSuchObjectException e) {
>     Database db = new Database(DEFAULT_DATABASE_NAME, DEFAULT_DATABASE_COMMENT,
>         wh.getDefaultDatabasePath(DEFAULT_DATABASE_NAME).toString(), null);
>     db.setOwnerName(PUBLIC);
>     db.setOwnerType(PrincipalType.ROLE);
>     ms.createDatabase(db);
>   }
> }
> {code}
> As you can see, the {{parameters}} field of the default database is set to {{null}}, so any code that dereferences it without a null check (such as {{database.properties.asJava}} above) will throw a NullPointerException.
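The actual patch is not included in this notification, but the shape of the fix is to treat a null parameters map as an empty one before converting it. A minimal Java sketch of that defaulting; the helper name {{orEmpty}} is hypothetical, not the method used in the Spark patch:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class NullSafeProperties {
    // A database's parameters map may be null (as in createDefaultDB_core
    // above), so substitute an immutable empty map before using it.
    static Map<String, String> orEmpty(Map<String, String> properties) {
        return properties == null ? Collections.emptyMap() : properties;
    }

    public static void main(String[] args) {
        // Simulates reading a database whose parameters were stored as null.
        Map<String, String> fromMetastore = null;
        Map<String, String> safe = orEmpty(fromMetastore);
        System.out.println(safe.size()); // prints 0 instead of throwing NPE
    }
}
```

With this defaulting in place, a caller like {{createDatabase}} can iterate or convert the map unconditionally, which is the behavior the resolved issue restores for custom metastores.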
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org