Posted to issues@spark.apache.org by "Ran Mingxuan (JIRA)" <ji...@apache.org> on 2017/11/22 07:48:01 UTC
[jira] [Comment Edited] (SPARK-22560) Must create spark session directly to connect to hive
[ https://issues.apache.org/jira/browse/SPARK-22560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16262093#comment-16262093 ]
Ran Mingxuan edited comment on SPARK-22560 at 11/22/17 7:47 AM:
----------------------------------------------------------------
I think the problem is introduced by org.apache.spark.sql.internal.SessionState, which takes its configuration from the SparkContext. That means `enableHiveSupport()` has no effect when the spark session is built from an existing spark context.
{code:java}
// The session state class is resolved from sparkContext.conf, so options
// set only through the builder (e.g. by enableHiveSupport()) are not
// visible here.
@InterfaceStability.Unstable
@transient
lazy val sessionState: SessionState = {
  parentSessionState
    .map(_.clone(this))
    .getOrElse {
      val state = SparkSession.instantiateSessionState(
        SparkSession.sessionStateClassName(sparkContext.conf),
        self)
      initialSessionOptions.foreach { case (k, v) => state.conf.setConfString(k, v) }
      state
    }
}
{code}
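A possible workaround, if the JavaSparkContext really must be created first, is to set spark.sql.catalogImplementation (the property that enableHiveSupport() sets) directly on the SparkConf, so it is already present in sparkContext.conf when the session state class is resolved. A minimal sketch, not verified against this issue:
{code:java}
// Sketch: pre-set the catalog implementation that enableHiveSupport()
// would otherwise configure, before the context is created.
SparkConf sparkConf = new SparkConf()
    .setAppName("testApp")
    .set("spark.sql.catalogImplementation", "hive");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
SparkSession spark = SparkSession.builder()
    .sparkContext(sc.sc())
    .getOrCreate();
spark.sql("show databases").show();
{code}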
> Must create spark session directly to connect to hive
> -----------------------------------------------------
>
> Key: SPARK-22560
> URL: https://issues.apache.org/jira/browse/SPARK-22560
> Project: Spark
> Issue Type: Bug
> Components: Java API, SQL
> Affects Versions: 2.1.0
> Reporter: Ran Mingxuan
> Original Estimate: 168h
> Remaining Estimate: 168h
>
> In a Java project I have to use both JavaSparkContext and SparkSession, and I found that the order in which they are created affects the Hive connection.
> I built a Spark job like the one below:
> {code:java}
> // Wrong: the session is built from an existing SparkContext, so
> // enableHiveSupport() has no effect and the Hive metastore is not found.
> public static void main(String[] args) {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     JavaSparkContext sc = new JavaSparkContext(sparkConf);
>     SparkSession spark = SparkSession.builder()
>         .sparkContext(sc.sc())
>         .enableHiveSupport()
>         .getOrCreate();
>     spark.sql("show databases").show();
> }
> {code}
> With this code the Spark job cannot find the Hive metastore, even though it discovers the correct warehouse directory.
> I have to use code like the following to make things work:
> {code:java}
> // Correct: create the SparkSession first, with Hive support enabled,
> // then derive the JavaSparkContext from it.
> public static void main(String[] args) {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     SparkSession spark = SparkSession.builder()
>         .config(sparkConf)
>         .enableHiveSupport()
>         .getOrCreate();
>     SparkContext sparkContext = spark.sparkContext();
>     JavaSparkContext sc = JavaSparkContext.fromSparkContext(sparkContext);
>     spark.sql("show databases").show();
> }
> {code}
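> To check which catalog implementation actually took effect, one diagnostic (a suggestion here, not part of the original report) is to read spark.sql.catalogImplementation from the context's SparkConf, since that is the property enableHiveSupport() sets:
> {code:java}
> // Prints "hive" when Hive support is active, "in-memory" otherwise.
> System.out.println(sc.getConf().get("spark.sql.catalogImplementation", "in-memory"));
> {code}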