Posted to issues@spark.apache.org by "jeanlyn (JIRA)" <ji...@apache.org> on 2015/06/02 04:49:18 UTC

[jira] [Comment Edited] (SPARK-8020) Spark SQL in spark-defaults.conf make metadataHive get constructed too early

    [ https://issues.apache.org/jira/browse/SPARK-8020?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14568427#comment-14568427 ] 

jeanlyn edited comment on SPARK-8020 at 6/2/15 2:48 AM:
--------------------------------------------------------

[~yhuai], when I set *spark.sql.hive.metastore.jars* in spark-defaults.conf I got errors like yours. But when I set *spark.sql.hive.metastore.jars* in *hive-site.xml* I got
{code}
15/06/02 10:42:04 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/06/02 10:42:04 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:41416 with 706.6 MB RAM, BlockManagerId(driver, localhost, 41416)
15/06/02 10:42:04 INFO storage.BlockManagerMaster: Registered BlockManager
SET spark.sql.hive.metastore.version=0.12.0
15/06/02 10:42:04 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.u
ris if you are connecting to a remote metastore.
15/06/02 10:42:04 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead
15/06/02 10:42:04 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 0.12.0 using maven.
Ivy Default Cache set to: /home/dd_edw/.ivy2/cache
The jars for the packages stored in: /home/dd_edw/.ivy2/jars
http://www.datanucleus.org/downloads/maven2 added as a remote repository with the name: repo-1
:: loading settings :: url = jar:file:/data0/spark-1.3.0-bin-2.2.0/lib/spark-assembly-1.4.0-SNAPSHOT-hadoop2.2.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.hive#hive-metastore added as a dependency
org.apache.hive#hive-exec added as a dependency
org.apache.hive#hive-common added as a dependency
org.apache.hive#hive-serde added as a dependency
com.google.guava#guava added as a dependency
org.apache.hadoop#hadoop-client added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
       confs: [default]
       found org.apache.hive#hive-metastore;0.12.0 in central
       found org.antlr#antlr;3.4 in central
       found org.antlr#antlr-runtime;3.4 in central
....
Exception in thread "main" java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: com/google/common/base/Preconditions when creating Hive client using classpath: fi
le:/tmp/hive3795822184995995241vv12/aopalliance_aopalliance-1.0.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hive_hive-exec-0.12.0.jar, file:/tmp/hive3795822184995995
241vv12/org.apache.thrift_libfb303-0.9.0.jar, file:/tmp/hive3795822184995995241vv12/commons-digester_commons-digester-1.8.jar, file:/tmp/hive3795822184995995241vv12/com.sun.je
rsey_jersey-client-1.9.jar, file:/tmp/hive3795822184995995241vv12/org.apache.httpcomponents_httpclient-4.2.5.jar, file:/tmp/hive3795822184995995241vv12/org.antlr_stringtemplat
e-3.2.1.jar, file:/tmp/hive3795822184995995241vv12/commons-logging_commons-logging-1.1.3.jar, file:/tmp/hive3795822184995995241vv12/org.antlr_antlr-runtime-3.4.jar, file:/tmp/
hive3795822184995995241vv12/org.mockito_mockito-all-1.8.2.jar, file:/tmp/hive3795822184995995241vv12/org.apache.derby_derby-10.4.2.0.jar, file:/tmp/hive3795822184995995241vv12
/antlr_antlr-2.7.7.jar, file:/tmp/hive3795822184995995241vv12/commons-net_commons-net-3.1.jar, file:/tmp/hive3795822184995995241vv12/org.slf4j_slf4j-log4j12-1.7.5.jar, file:/t
mp/hive3795822184995995241vv12/junit_junit-3.8.1.jar, file:/tmp/hive3795822184995995241vv12/org.codehaus.jackson_jackson-jaxrs-1.8.8.jar, file:/tmp/hive3795822184995995241vv12
/commons-cli_commons-cli-1.2.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hive_hive-serde-0.12.0.jar, file:/tmp/hive3795822184995995241vv12/org.codehaus.jettison_jett
ison-1.1.jar, file:/tmp/hive3795822184995995241vv12/javax.xml.stream_stax-api-1.0-2.jar, file:/tmp/hive3795822184995995241vv12/org.apache.avro_avro-1.7.4.jar, file:/tmp/hive37
95822184995995241vv12/org.apache.hadoop_hadoop-mapreduce-client-app-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-mapreduce-client-common-2.4.0.jar
, file:/tmp/hive3795822184995995241vv12/org.codehaus.jackson_jackson-xc-1.8.8.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-annotations-2.4.0.jar, file:/
tmp/hive3795822184995995241vv12/org.mortbay.jetty_jetty-util-6.1.26.jar, file:/tmp/hive3795822184995995241vv12/org.apache.commons_commons-math3-3.1.1.jar, file:/tmp/hive379582
2184995995241vv12/javax.transaction_jta-1.1.jar, file:/tmp/hive3795822184995995241vv12/commons-httpclient_commons-httpclient-3.1.jar, file:/tmp/hive3795822184995995241vv12/xml
enc_xmlenc-0.52.jar, file:/tmp/hive3795822184995995241vv12/org.sonatype.sisu.inject_cglib-2.2.1-v20090111.jar, file:/tmp/hive3795822184995995241vv12/com.google.code.findbugs_j
sr305-1.3.9.jar, file:/tmp/hive3795822184995995241vv12/commons-codec_commons-codec-1.4.jar, file:/tmp/hive3795822184995995241vv12/com.google.guava_guava-14.0.1.jar, file:/tmp/
hive3795822184995995241vv12/org.apache.hadoop_hadoop-mapreduce-client-shuffle-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/org.jboss.netty_netty-3.2.2.Final.jar, file:/tmp
/hive3795822184995995241vv12/org.apache.commons_commons-compress-1.4.1.jar, file:/tmp/hive3795822184995995241vv12/org.apache.avro_avro-mapred-1.7.1.jar, file:/tmp/hive37958221
84995995241vv12/org.slf4j_slf4j-api-1.7.5.jar, file:/tmp/hive3795822184995995241vv12/javolution_javolution-5.5.1.jar, file:/tmp/hive3795822184995995241vv12/com.sun.xml.bind_ja
xb-impl-2.2.3-1.jar, file:/tmp/hive3795822184995995241vv12/org.iq80.snappy_snappy-0.2.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-yarn-client-2.4.0.jar
, file:/tmp/hive3795822184995995241vv12/log4j_log4j-1.2.17.jar, file:/tmp/hive3795822184995995241vv12/commons-pool_commons-pool-1.5.4.jar, file:/tmp/hive3795822184995995241vv1
2/io.netty_netty-3.4.0.Final.jar, file:/tmp/hive3795822184995995241vv12/org.apache.avro_avro-ipc-1.7.1.jar, file:/tmp/hive3795822184995995241vv12/org.apache.zookeeper_zookeepe
r-3.4.3.jar, file:/tmp/hive3795822184995995241vv12/org.json_json-20090211.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hive_hive-metastore-0.12.0.jar, file:/tmp/hive3
795822184995995241vv12/org.datanucleus_datanucleus-api-jdo-3.2.1.jar, file:/tmp/hive3795822184995995241vv12/org.mortbay.jetty_servlet-api-2.5-20081211.jar, file:/tmp/hive37958
22184995995241vv12/org.apache.hadoop_hadoop-auth-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/javax.xml.bind_jaxb-api-2.2.2.jar, file:/tmp/hive3795822184995995241vv12/com.
sun.jersey_jersey-server-1.9.jar, file:/tmp/hive3795822184995995241vv12/asm_asm-3.2.jar, file:/tmp/hive3795822184995995241vv12/javax.activation_activation-1.1.jar, file:/tmp/h
ive3795822184995995241vv12/org.datanucleus_datanucleus-core-3.2.2.jar, file:/tmp/hive3795822184995995241vv12/com.jolbox_bonecp-0.7.1.RELEASE.jar, file:/tmp/hive379582218499599
5241vv12/org.tukaani_xz-1.0.jar, file:/tmp/hive3795822184995995241vv12/org.mortbay.jetty_jetty-6.1.26.jar, file:/tmp/hive3795822184995995241vv12/com.sun.jersey.contribs_jersey
-guice-1.9.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-hdfs-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/commons-collections_commons-collections-3.
2.1.jar, file:/tmp/hive3795822184995995241vv12/commons-beanutils_commons-beanutils-1.7.0.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-mapreduce-client-c
ore-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-common-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/com.googlecode.javaewah_JavaEWAH-0.3.2.ja
r, file:/tmp/hive3795822184995995241vv12/com.sun.jersey_jersey-json-1.9.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-mapreduce-client-jobclient-2.4.0.ja
r, file:/tmp/hive3795822184995995241vv12/com.sun.jersey_jersey-core-1.9.jar, file:/tmp/hive3795822184995995241vv12/org.datanucleus_datanucleus-rdbms-3.2.1.jar, file:/tmp/hive3
795822184995995241vv12/javax.jdo_jdo-api-3.0.1.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hive_hive-common-0.12.0.jar, file:/tmp/hive3795822184995995241vv12/commons
-beanutils_commons-beanutils-core-1.8.0.jar, file:/tmp/hive3795822184995995241vv12/org.codehaus.jackson_jackson-mapper-asl-1.8.8.jar, file:/tmp/hive3795822184995995241vv12/com
.thoughtworks.paranamer_paranamer-2.3.jar, file:/tmp/hive3795822184995995241vv12/com.google.protobuf_protobuf-java-2.5.0.jar, file:/tmp/hive3795822184995995241vv12/javax.servl
et_servlet-api-2.5.jar, file:/tmp/hive3795822184995995241vv12/org.apache.velocity_velocity-1.7.jar, file:/tmp/hive3795822184995995241vv12/org.apache.thrift_libthrift-0.9.0.jar
, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-yarn-server-common-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/jline_jline-0.9.94.jar, file:/tmp/hive3795
822184995995241vv12/commons-logging_commons-logging-api-1.0.4.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hive_hive-shims-0.12.0.jar, file:/tmp/hive37958221849959952
41vv12/org.apache.hadoop_hadoop-yarn-api-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/commons-io_commons-io-2.4.jar, file:/tmp/hive3795822184995995241vv12/org.antlr_ST4-4.
0.4.jar, file:/tmp/hive3795822184995995241vv12/org.codehaus.jackson_jackson-core-asl-1.8.8.jar, file:/tmp/hive3795822184995995241vv12/commons-lang_commons-lang-2.6.jar, file:/
tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-yarn-common-2.4.0.jar, file:/tmp/hive3795822184995995241vv12/org.apache.httpcomponents_httpcore-4.2.5.jar, file:/tmp/h
ive3795822184995995241vv12/org.antlr_antlr-3.4.jar, file:/tmp/hive3795822184995995241vv12/commons-configuration_commons-configuration-1.6.jar, file:/tmp/hive379582218499599524
1vv12/com.google.inject_guice-3.0.jar, file:/tmp/hive3795822184995995241vv12/javax.inject_javax.inject-1.jar, file:/tmp/hive3795822184995995241vv12/org.xerial.snappy_snappy-ja
va-1.0.4.1.jar, file:/tmp/hive3795822184995995241vv12/org.apache.hadoop_hadoop-client-2.4.0.jar
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
       at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:174)
       at org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:166)
       at org.apache.spark.sql.hive.client.IsolatedClientLoader$.forVersion(IsolatedClientLoader.scala:45)
       at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:213)
       at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:174)
       at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:349)
       at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:188)
       at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:187)
       at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
       at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
       at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:187)
       at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:70)
       at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:248)
       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:136)
       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
       at java.lang.reflect.Method.invoke(Method.java:597)
       at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
       at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/02 10:42:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
{code}
as I showed in https://github.com/apache/spark/pull/5876.
Shall we move this discussion to the user mailing list?
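For anyone hitting the same error: the hint at the end of the log is about the jars passed to *spark.sql.hive.metastore.jars*. A minimal sketch of the two related settings (the paths below are hypothetical placeholders, not taken from my environment):
{code}
# spark-defaults.conf -- hypothetical sketch; paths are placeholders
# Note (SPARK-8020): setting any Spark SQL key here can construct metadataHive too early
spark.sql.hive.metastore.version   0.12.0
# either the literal value "maven" (resolve jars via Ivy, as in the log above)
# or a classpath containing the Hive 0.12 and Hadoop jars, Guava included
spark.sql.hive.metastore.jars      /opt/hive-0.12.0/lib/*:/opt/hadoop/lib/*
{code}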





> Spark SQL in spark-defaults.conf make metadataHive get constructed too early
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-8020
>                 URL: https://issues.apache.org/jira/browse/SPARK-8020
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>            Priority: Critical
>
> To correctly construct a {{metadataHive}} object, we need two settings: {{spark.sql.hive.metastore.version}} and {{spark.sql.hive.metastore.jars}}. If users want to use Hive 0.12's metastore, they need to set {{spark.sql.hive.metastore.version}} to {{0.12.0}} and set {{spark.sql.hive.metastore.jars}} to {{maven}} or to a classpath containing Hive's and Hadoop's jars. However, any Spark SQL setting in {{spark-defaults.conf}} will trigger the construction of {{metadataHive}} and cause Spark SQL to connect to the wrong metastore (e.g. to the local Derby metastore instead of a remote MySQL-backed Hive 0.12 metastore). Also, if {{spark.sql.hive.metastore.version 0.12.0}} is the first conf set in the SQL conf, we will get
> {code}
> Exception in thread "main" java.lang.IllegalArgumentException: Builtin jars can only be used when hive execution version == hive metastore version. Execution: 0.13.1 != Metastore: 0.12.0. Specify a vaild path to the correct hive jars using $HIVE_METASTORE_JARS or change spark.sql.hive.metastore.version to 0.13.1.
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:186)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:175)
> 	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:358)
> 	at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:186)
> 	at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:185)
> 	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> 	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:185)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:248)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:136)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
