Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2016/11/23 17:56:58 UTC

[jira] [Resolved] (SPARK-18050) spark 2.0.1 enable hive throw AlreadyExistsException(message:Database default already exists)

     [ https://issues.apache.org/jira/browse/SPARK-18050?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or resolved SPARK-18050.
-------------------------------
          Resolution: Fixed
       Fix Version/s: 2.1.0
    Target Version/s: 2.1.0

> spark 2.0.1 enable hive throw AlreadyExistsException(message:Database default already exists)
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18050
>                 URL: https://issues.apache.org/jira/browse/SPARK-18050
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>         Environment: JDK 1.8, macOS, Spark 2.0.1
>            Reporter: todd.chen
>            Assignee: Wenchen Fan
>             Fix For: 2.1.0
>
>
> In Spark 2.0.1, with Hive support enabled, initializing the SQLContext throws an AlreadyExistsException(message:Database default already exists), the same issue as
> https://www.mail-archive.com/dev@spark.apache.org/msg15306.html. My code is:
> {code}
> import org.apache.hadoop.conf.Configuration
> import org.apache.hadoop.fs.FileSystem
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
>
> private val master = "local[*]"
> private val appName = "xqlServerSpark"
>
> // FileSystem.get needs a Hadoop Configuration; the no-arg call does not compile
> val fileSystem = FileSystem.get(new Configuration())
> val sparkConf = new SparkConf()
>   .setMaster(master)
>   .setAppName(appName)
>   .set("spark.sql.warehouse.dir", s"${fileSystem.getUri.toASCIIString}/user/hive/warehouse")
>
> // enableHiveSupport() triggers creation of the Hive metastore client
> val hiveContext = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate().sqlContext
> println(sparkConf.get("spark.sql.warehouse.dir"))
> hiveContext.sql("show tables").show()
> {code}
> The result is correct, but an exception is also thrown by the code above.
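
The exception itself originates in the Hive metastore client: on session startup Spark attempts to create the default database, and Hive raises AlreadyExistsException when it is already there. A minimal sketch of the guard that the 2.1.0 fix amounts to (the names externalCatalog, defaultDbDefinition, and SessionCatalog.DEFAULT_DATABASE follow Spark's SharedState, but treat this as an illustration rather than the exact patch):

{code}
// Sketch: create the default database only when it is absent, and pass
// ignoreIfExists = true so a racing create cannot fail session startup.
if (!externalCatalog.databaseExists(SessionCatalog.DEFAULT_DATABASE)) {
  externalCatalog.createDatabase(defaultDbDefinition, ignoreIfExists = true)
}
{code}

With a guard of this shape in place, enabling Hive support against an existing warehouse no longer surfaces the AlreadyExistsException at startup.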


