Posted to issues@hive.apache.org by "Bharathkrishna Guruvayoor Murali (JIRA)" <ji...@apache.org> on 2018/04/24 23:34:00 UTC

[jira] [Commented] (HIVE-18958) Fix Spark config warnings

    [ https://issues.apache.org/jira/browse/HIVE-18958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16451405#comment-16451405 ] 

Bharathkrishna Guruvayoor Murali commented on HIVE-18958:
---------------------------------------------------------

Still pending: removing the warning for the configuration key 'spark.yarn.driver.memoryOverhead'.
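
For the deprecated-key warnings, the fix amounts to mapping the old YARN-era names to their Spark 2.3 replacements before the values reach SparkConf. A minimal sketch of that translation, assuming the config is handled as plain String key/value pairs; the key names come from the warnings below, but the helper class and method are hypothetical, not the actual patch:

{code:java}
import java.util.HashMap;
import java.util.Map;

public class DeprecatedSparkKeys {

  // Deprecated-to-current key names, taken from the Spark 2.3 warnings below.
  private static final Map<String, String> RENAMED = new HashMap<>();
  static {
    RENAMED.put("spark.yarn.driver.memoryOverhead", "spark.driver.memoryOverhead");
    RENAMED.put("spark.yarn.executor.memoryOverhead", "spark.executor.memoryOverhead");
  }

  // Hypothetical helper: rewrite deprecated keys before they are handed to
  // SparkConf, so SparkConf never sees (and never warns about) the old names.
  public static Map<String, String> translate(Map<String, String> conf) {
    Map<String, String> out = new HashMap<>();
    for (Map.Entry<String, String> e : conf.entrySet()) {
      out.put(RENAMED.getOrDefault(e.getKey(), e.getKey()), e.getValue());
    }
    return out;
  }
}
{code}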

Submitting the patch to run tests, and also to get a review of removing the code that adds HIVE_SPARK_RSC_CONFIGS to SparkConf in HiveSparkClientFactory.
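
Removing HIVE_SPARK_RSC_CONFIGS from the SparkConf population means the hive.spark.client.* keys never reach spark-submit, which is what produces the "Ignoring non-spark config property" lines below. A minimal sketch of the effect, again with a hypothetical helper class; the prefix check is an assumption for illustration (the patch itself removes the code that adds these keys rather than filtering them afterwards):

{code:java}
import java.util.HashMap;
import java.util.Map;

public class RscConfFilter {

  // Hypothetical helper illustrating the intended effect: keys that only
  // configure Hive's remote Spark client (the RSC) are dropped before the
  // remaining properties are passed along to Spark, so spark-submit no
  // longer logs "Ignoring non-spark config property" for them.
  public static Map<String, String> stripRscOnlyKeys(Map<String, String> conf) {
    Map<String, String> out = new HashMap<>();
    for (Map.Entry<String, String> e : conf.entrySet()) {
      if (!e.getKey().startsWith("hive.spark.client.")) {
        out.put(e.getKey(), e.getValue());
      }
    }
    return out;
  }
}
{code}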

> Fix Spark config warnings
> -------------------------
>
>                 Key: HIVE-18958
>                 URL: https://issues.apache.org/jira/browse/HIVE-18958
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Bharathkrishna Guruvayoor Murali
>            Priority: Major
>         Attachments: HIVE-18958.01.patch
>
>
> Getting a few configuration warnings in the logs that we should fix:
> {code}
> 2018-03-14T10:06:19,164  WARN [d5ade9e4-9354-40f1-8f74-631f373709b3 main] spark.SparkConf: The configuration key 'spark.yarn.driver.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.driver.memoryOverhead' instead.
> 2018-03-14T10:06:19,165  WARN [d5ade9e4-9354-40f1-8f74-631f373709b3 main] spark.SparkConf: The configuration key spark.akka.logLifecycleEvents is not supported any more because Spark doesn't use Akka since 2.0
> 2018-03-14T10:06:19,165  WARN [d5ade9e4-9354-40f1-8f74-631f373709b3 main] spark.SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
> 2018-03-14T10:06:20,351  INFO [RemoteDriver-stderr-redir-d5ade9e4-9354-40f1-8f74-631f373709b3 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
> 2018-03-14T10:06:20,351  INFO [RemoteDriver-stderr-redir-d5ade9e4-9354-40f1-8f74-631f373709b3 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 2018-03-14T10:06:20,351  INFO [RemoteDriver-stderr-redir-d5ade9e4-9354-40f1-8f74-631f373709b3 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=30000
> 2018-03-14T10:06:20,351  INFO [RemoteDriver-stderr-redir-d5ade9e4-9354-40f1-8f74-631f373709b3 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 2018-03-14T10:06:20,351  INFO [RemoteDriver-stderr-redir-d5ade9e4-9354-40f1-8f74-631f373709b3 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> {code}


