Posted to issues@spark.apache.org by "Cheng Lian (JIRA)" <ji...@apache.org> on 2015/06/02 08:27:18 UTC

[jira] [Created] (SPARK-8031) Version number written to Hive metastore is "0.13.1aa" instead of "0.13.1a"

Cheng Lian created SPARK-8031:
---------------------------------

             Summary: Version number written to Hive metastore is "0.13.1aa" instead of "0.13.1a"
                 Key: SPARK-8031
                 URL: https://issues.apache.org/jira/browse/SPARK-8031
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.1, 1.3.0, 1.2.2, 1.2.1, 1.2.0, 1.4.0
            Reporter: Cheng Lian
            Priority: Trivial


While debugging {{CliSuite}} against 1.4.0-SNAPSHOT, I noticed the following WARN log line:
{noformat}
15/06/02 13:40:29 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
{noformat}
The problem is that the Hive dependency version used by 1.4.0-SNAPSHOT is {{0.13.1a}} (the one shaded by [~pwendell]), but the version shown in this line is {{0.13.1aa}} (one extra {{a}}). The WARN log itself is expected, since {{CliSuite}} initializes a brand new temporary Derby metastore.
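
For reference, a minimal sketch of how the warning can be reproduced outside of {{CliSuite}} (illustrative only; it assumes the Spark 1.4 API and a clean working directory so that {{HiveContext}} creates a fresh local Derby metastore under {{metastore_db/}}):
{noformat}
// Illustrative sketch only (Spark 1.4 API). Run from a clean working directory so
// that HiveContext creates a brand new local Derby metastore (metastore_db/).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("metastore-version-repro"))
val hiveContext = new HiveContext(sc)

// Any metastore access initializes ObjectStore, which then records the schema
// version ("0.13.1aa" with the current shaded Hive) into the new metastore and
// emits the WARN shown above.
hiveContext.sql("SHOW TABLES").collect()
{noformat}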

While initializing the Hive metastore, Hive calls {{ObjectStore.checkSchema()}} and may write the "short" version string to the metastore. This short version string is defined by the {{hive.version.shortname}} property in the POM. However, [it was defined as {{0.13.1aa}}|https://github.com/pwendell/hive/commit/32e515907f0005c7a28ee388eadd1c94cf99b2d4#diff-600376dffeb79835ede4a0b285078036R62]. Confirmed with [~pwendell] that this is a typo.
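
For context, the fix would be a one-character change to that property in the forked Hive POM. The snippet below is a rough sketch showing the corrected value; the surrounding layout is illustrative, not a copy of the actual file:
{noformat}
<!-- Sketch of the relevant properties in the forked Hive POM (layout illustrative).
     hive.version.shortname is what ObjectStore.checkSchema() records in the
     metastore; it currently reads "0.13.1aa" and should be "0.13.1a". -->
<properties>
  <hive.version>0.13.1a</hive.version>
  <hive.version.shortname>0.13.1a</hive.version.shortname>
</properties>
{noformat}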

This doesn't cause any trouble for now, but we probably want to fix it if we ever need to release another shaded version of Hive 0.13.1.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org