Posted to user@spark.apache.org by Rishikesh Gawade <ri...@gmail.com> on 2018/04/16 15:18:08 UTC

Error: NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT while running a Spark-Hive Job

Hello there,
I am using *spark-2.3.0*, compiled with the following Maven command:
*mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean install*
I have configured it to run with *Hive v2.3.3*. I have also replaced all the
Hive-related JARs (*v1.2.1*) in Spark's *jars* folder with the JARs from
Hive's *lib* folder, and configured Hive to use *spark* as its
*execution engine*.
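As a side note for anyone reproducing this setup: stock Spark 2.3.0 ships its own Hive 1.2.1 client jars, so it can help to confirm exactly which Hive jars ended up in Spark's *jars* directory before submitting anything. A minimal diagnostic sketch (the */opt/spark* fallback path and the `HiveJarCheck` class name are assumptions, not part of the original setup):

```java
import java.io.File;
import java.util.Arrays;

// Diagnostic sketch: list the Hive client jars in Spark's jar directory.
// Stock Spark 2.3.0 ships Hive 1.2.1 jars (e.g. hive-exec-1.2.1.spark2.jar);
// seeing Hive 2.3.3 jars here instead means the builtin client was replaced.
public class HiveJarCheck {
    static String[] hiveJars(String sparkHome) {
        File dir = new File(sparkHome, "jars");
        // Keep only entries whose name mentions "hive".
        String[] names = dir.list((d, name) -> name.toLowerCase().contains("hive"));
        return names == null ? new String[0] : names;  // null if the dir is missing
    }

    public static void main(String[] args) {
        // SPARK_HOME is an assumed environment variable; adjust to your install.
        String home = System.getenv().getOrDefault("SPARK_HOME", "/opt/spark");
        System.out.println(Arrays.toString(hiveJars(home)));
    }
}
```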

After that, I wrote the following lines in my Java application:

SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark Hive Example")
        .config("hive.metastore.warehouse.dir", "/user/hive/warehouse")
        .config("hive.metastore.uris", "thrift://hadoopmaster:9083")
        .enableHiveSupport()
        .getOrCreate();
// HiveContext is deprecated in Spark 2.x; with enableHiveSupport()
// the session itself can run Hive queries directly.
spark.sql("SHOW DATABASES").show();

I then built the project (no compilation errors) and tried running the job
with the following spark-submit command:

spark-submit --master yarn --class org.adbms.SpamFilter
IdeaProjects/mlproject/target/mlproject-1.0-SNAPSHOT.jar

On doing so, I received the following error:
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
    at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:205)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
    and more...
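For what it's worth, a NoSuchFieldError at runtime after a clean compile normally means the classpath holds a different version of a dependency than the one the code was compiled against; HiveConf.ConfVars.HIVE_STATS_JDBC_TIMEOUT exists in Hive 1.2.x but was removed in Hive 2.x. A small reflection probe can confirm which copy of the class is actually on the classpath (a sketch; the class and field names come from the error above, and the `FieldProbe` class is a made-up name for illustration):

```java
// Sketch: check whether a field exists on the classpath's copy of a class.
// Hive 1.2.x defines ConfVars.HIVE_STATS_JDBC_TIMEOUT; Hive 2.x removed it,
// which is exactly what Spark's HiveUtils trips over here.
public class FieldProbe {
    static String probe(String className, String fieldName) {
        try {
            // Enum constants are public static fields, so getField finds them.
            Class.forName(className).getField(fieldName);
            return fieldName + " present in " + className;
        } catch (ClassNotFoundException e) {
            return className + " not on classpath";
        } catch (NoSuchFieldException e) {
            return fieldName + " missing from " + className + " (version mismatch)";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe(
                "org.apache.hadoop.hive.conf.HiveConf$ConfVars",
                "HIVE_STATS_JDBC_TIMEOUT"));
    }
}
```

Run this on the same classpath spark-submit builds; if it reports the field missing, the Hive 2.3.3 jars copied into Spark's jars folder are shadowing the 1.2.1 ones Spark was compiled against.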


I have no idea what this is about. Is it because Spark v2.3.0 and Hive
v2.3.3 aren't compatible with each other?
If I have done anything wrong, please point it out and suggest the required
changes. If I have misconfigured Spark and Hive, please suggest the
necessary configuration changes; a link to a guide covering all the
necessary configs would also be appreciated.
Thank you in advance.
Regards,
Rishikesh Gawade

Re: Error: NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT while running a Spark-Hive Job

Posted by rajiv shah <ra...@gigaspaces.com>.
How did you resolve this problem?



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org