Posted to dev@ambari.apache.org by "Andrew Onischuk (JIRA)" <ji...@apache.org> on 2015/04/30 10:45:06 UTC

[jira] [Resolved] (AMBARI-10859) hive-site.xml packaged under /etc/spark/conf is not correct

     [ https://issues.apache.org/jira/browse/AMBARI-10859?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Onischuk resolved AMBARI-10859.
--------------------------------------
    Resolution: Fixed

Committed to trunk

> hive-site.xml packaged under /etc/spark/conf is not correct
> -----------------------------------------------------------
>
>                 Key: AMBARI-10859
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10859
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.1.0
>
>
> Ambari-2.1.0 for Dal is putting a lot more properties in
> /etc/spark/conf/hive-site.xml than desired. It's leading to unnecessary
> exceptions while trying to load HiveContext in the Spark shell. Here is the error:
>     
>     
>     15/04/21 08:37:44 INFO ParseDriver: Parsing command: show tables
>     15/04/21 08:37:44 INFO ParseDriver: Parse Completed
>     java.lang.RuntimeException: java.lang.NumberFormatException: For input string: "5s"
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:237)
>         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:233)
>         at scala.Option.orElse(Option.scala:257)
>         at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:233)
>         at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:56)
>         at org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext$$anon$4.<init>(HiveContext.scala:265)
>         ....
>     
> In the previous Ambari release we were adding only a handful of properties
> (< 10); now it is 150+ (attached). We should revert to the old behavior.
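[Editor's note: the NumberFormatException above is typical of what happens when a Hive 1.x-style time-suffixed value (e.g. "5s") ends up in a hive-site.xml read by the older Hive libraries bundled with Spark, which parse such properties as plain integers. A minimal sketch of what a trimmed /etc/spark/conf/hive-site.xml might look like is shown below; the metastore hostname and port are placeholders, and the exact property set Ambari ships after this fix is not specified here.]

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal hive-site.xml sketch for Spark's conf dir (hypothetical values).
     Spark mainly needs to know where the Hive metastore lives; passing the
     full Hive server config can include values (like "5s") that Spark's
     bundled Hive version cannot parse. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <!-- Placeholder host/port; point at your actual metastore Thrift service. -->
    <value>thrift://metastore-host.example.com:9083</value>
  </property>
</configuration>
```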



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)