Posted to issues@spark.apache.org by "Kent Yao (JIRA)" <ji...@apache.org> on 2017/11/07 02:41:00 UTC

[jira] [Created] (SPARK-22463) Missing hadoop/hive/hbase/etc configuration files in SPARK_CONF_DIR to distributed archive

Kent Yao created SPARK-22463:
--------------------------------

             Summary: Missing hadoop/hive/hbase/etc configuration files in SPARK_CONF_DIR to distributed archive
                 Key: SPARK-22463
                 URL: https://issues.apache.org/jira/browse/SPARK-22463
             Project: Spark
          Issue Type: Bug
          Components: YARN
    Affects Versions: 2.2.0, 2.1.2
            Reporter: Kent Yao


When I ran self-contained SQL apps, such as
{code:java}
import org.apache.spark.sql.SparkSession

object ShowHiveTables {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("Show Hive Tables")
      .enableHiveSupport()
      .getOrCreate()
    spark.sql("show tables").show()
    spark.stop()
  }
}
{code}
in **yarn cluster** mode, with `hive-site.xml` correctly placed in `$SPARK_HOME/conf`, they failed to connect to the right Hive metastore because `hive-site.xml` was not visible on the AM/Driver's classpath.

Submitting them with `--files/--jars local/path/to/hive-site.xml`, or putting the file in `$HADOOP_CONF_DIR`/`$YARN_CONF_DIR`, makes these apps work in cluster mode just as they do in client mode. However, the official doc (see http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables) says:
> Configuration of Hive is done by placing your hive-site.xml, core-site.xml (for security configuration), and hdfs-site.xml (for HDFS configuration) file in conf/.
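
For reference, here is what the `--files` workaround looks like as a full submit command; the jar name `show-hive-tables.jar` and the local paths are hypothetical placeholders:

{code}
# Ship hive-site.xml explicitly so the AM/Driver sees it in cluster mode.
# show-hive-tables.jar and the local paths below are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class ShowHiveTables \
  --files /local/path/to/hive-site.xml \
  show-hive-tables.jar
{code}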

We should either respect these configuration files in cluster mode as well, or update the Hive-tables documentation for cluster mode. A rough sketch of the first option follows.
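
As a minimal sketch (not Spark's actual implementation), the YARN client could pick up well-known configuration files from `$SPARK_CONF_DIR` and ship them with the distributed conf archive; `ConfArchiveSketch` and its file list are assumptions for illustration:

{code:java}
import java.io.File

// Hypothetical sketch: locate the hadoop/hive/hbase configuration files
// sitting in SPARK_CONF_DIR that the YARN client could add to the
// distributed archive. Not Spark's actual implementation.
object ConfArchiveSketch {
  // The configuration files discussed above; this list is an assumption.
  private val confFileNames =
    Set("hive-site.xml", "hbase-site.xml", "core-site.xml", "hdfs-site.xml")

  def localConfFiles(): Seq[File] =
    sys.env.get("SPARK_CONF_DIR")
      .map(new File(_))
      .filter(_.isDirectory)
      .toSeq
      .flatMap(d => Option(d.listFiles()).getOrElse(Array.empty[File]))
      .filter(f => f.isFile && confFileNames.contains(f.getName))

  def main(args: Array[String]): Unit =
    // Print the files that would be shipped alongside the Spark conf.
    localConfFiles().foreach(f => println(s"would distribute: ${f.getAbsolutePath}"))
}
{code}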
