Posted to issues@spark.apache.org by "Nitin Goyal (JIRA)" <ji...@apache.org> on 2015/05/04 06:12:06 UTC

[jira] [Created] (SPARK-7331) Create HiveConf per application instead of per query in HiveQl.scala

Nitin Goyal created SPARK-7331:
----------------------------------

             Summary: Create HiveConf per application instead of per query in HiveQl.scala
                 Key: SPARK-7331
                 URL: https://issues.apache.org/jira/browse/SPARK-7331
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 1.3.0, 1.2.0
            Reporter: Nitin Goyal
            Priority: Minor


A new HiveConf is created per query in the getAst method of HiveQl.scala:

  def getAst(sql: String): ASTNode = {
    /*
     * The Context has to be passed in for Hive 0.13.1.
     * Otherwise a NullPointerException is thrown
     * when retrieving properties from the HiveConf.
     */
    val hContext = new Context(new HiveConf())
    val node = ParseUtils.findRootNonNullToken((new ParseDriver).parse(sql, hContext))
    hContext.clear()
    node
  }

Creating a HiveConf adds a minimum of 90 ms of overhead per query. We should move its creation into the enclosing object so that it is initialised once per application instead of once per query.
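A minimal sketch of the proposed change (hypothetical, not a committed patch; it assumes getAst lives in the HiveQl object and that a single HiveConf can safely be shared across queries for parsing):

  import org.apache.hadoop.hive.conf.HiveConf
  import org.apache.hadoop.hive.ql.Context
  import org.apache.hadoop.hive.ql.parse.{ASTNode, ParseDriver, ParseUtils}

  private[hive] object HiveQl {
    // Built lazily, once per application, instead of once per getAst call.
    // Assumption: the parser only reads properties from this conf, so sharing it is safe.
    private lazy val hiveConf = new HiveConf()

    def getAst(sql: String): ASTNode = {
      /*
       * The Context has to be passed in for Hive 0.13.1.
       * Otherwise a NullPointerException is thrown
       * when retrieving properties from the HiveConf.
       */
      val hContext = new Context(hiveConf)
      val node = ParseUtils.findRootNonNullToken((new ParseDriver).parse(sql, hContext))
      hContext.clear()
      node
    }
  }

With a change along these lines, the ~90 ms HiveConf construction cost is paid once on first use rather than on every query.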


