Posted to issues@spark.apache.org by "ZhouDaHong (Jira)" <ji...@apache.org> on 2020/08/14 04:21:00 UTC

[jira] [Commented] (SPARK-32345) SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session

    [ https://issues.apache.org/jira/browse/SPARK-32345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17177468#comment-17177468 ] 

ZhouDaHong commented on SPARK-32345:
------------------------------------

If a version conflict has been ruled out as the cause, check the queue resources.

If queue usage reaches 100% and no free resources are released in time to create the Spark session, the task fails and this exception is thrown.

Solution: increase the Hive client's connection timeout to 15 minutes (900000 ms):

set hive.spark.client.server.connect.timeout=900000;
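The same value can also be set cluster-wide instead of per session (a minimal sketch; this assumes the timeout is managed through hive-site.xml, and the description text is only a paraphrase of the property's purpose):

    <!-- hive-site.xml -->
    <property>
      <name>hive.spark.client.server.connect.timeout</name>
      <value>900000</value>
      <description>Handshake timeout between the Hive client and the remote Spark driver, in milliseconds.</description>
    </property>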

> SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session
> --------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32345
>                 URL: https://issues.apache.org/jira/browse/SPARK-32345
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: 任建亭
>            Priority: Blocker
>
>  when using hive on spark engine:
>     FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session
> hadoop version: 2.7.3 / hive version: 3.1.2 / spark version 3.0.0
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org