Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/09/15 23:16:46 UTC

[jira] [Resolved] (SPARK-4758) Make metastore_db in-memory for HiveContext

     [ https://issues.apache.org/jira/browse/SPARK-4758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust resolved SPARK-4758.
-------------------------------------
    Resolution: Won't Fix

I'm going to close this stale ticket.  Please reopen if you intend to work on it in the context of speeding up tests.

> Make metastore_db in-memory for HiveContext
> -------------------------------------------
>
>                 Key: SPARK-4758
>                 URL: https://issues.apache.org/jira/browse/SPARK-4758
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.2.0, 1.3.0
>            Reporter: Jianshi Huang
>            Priority: Minor
>
> HiveContext by default will create a local folder metastore_db.
> This is not very user-friendly, as metastore_db is locked by HiveContext, which prevents multiple Spark processes from starting in the same directory.
> I would propose adding a default hive-site.xml in conf/ with the following content.
> <configuration>
>   <property>
>     <name>javax.jdo.option.ConnectionURL</name>
>     <value>jdbc:derby:memory:databaseName=metastore_db;create=true</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionDriverName</name>
>     <value>org.apache.derby.jdbc.EmbeddedDriver</value>
>   </property>
>   <property>
>     <name>hive.metastore.warehouse.dir</name>
>     <value>file://${user.dir}/hive/warehouse</value>
>   </property>
> </configuration>
> The connection URL jdbc:derby:memory:databaseName=metastore_db;create=true ensures the embedded Derby database is created in-memory.
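> As a workaround that does not require shipping a default conf/hive-site.xml, roughly the same settings could be supplied per invocation through Spark's spark.hadoop.* configuration pass-through, which forwards properties into the Hadoop/Hive configuration. A sketch (my-app.jar is a placeholder; exact behavior may vary by Spark version):
>
>   spark-submit \
>     --conf spark.hadoop.javax.jdo.option.ConnectionURL="jdbc:derby:memory:databaseName=metastore_db;create=true" \
>     --conf spark.hadoop.javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver \
>     my-app.jar
>
> This keeps the in-memory metastore opt-in per process instead of changing the default for everyone.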
> Jianshi



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org