Posted to dev@hudi.apache.org by selvaraj periyasamy <se...@gmail.com> on 2020/05/26 16:52:41 UTC

Re: Hudi 0.5.0 -> Hive JDBC call fails

Hi Vinoth,



I see the below comment in the Hudi code. How can I start using the metastore
client for Hive registrations? Is there a way to disable the useJdbc flag?



// Support both JDBC and metastore based implementations for backwards compatibility.
// Future users should disable jdbc and depend on metastore client for all hive registrations
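To make the question concrete, below is the kind of writer configuration I am
hoping is possible. This is only a hypothetical sketch: I am assuming the
useJdbc flag from the comment above is exposed as a datasource option named
"hoodie.datasource.hive_sync.use_jdbc" (a name I have not confirmed exists in
0.5.0), and df stands for the DataFrame being written.

import org.apache.spark.sql.SaveMode

// Hypothetical sketch: the enable/database/table options are the usual Hudi
// hive sync settings; "use_jdbc" is the assumed knob for the flag in the
// comment above, switching sync from the JDBC path to the metastore client.
df.write
  .format("org.apache.hudi")
  .option("hoodie.datasource.hive_sync.enable", "true")
  .option("hoodie.datasource.hive_sync.database", "default")
  .option("hoodie.datasource.hive_sync.table", "trr")
  .option("hoodie.datasource.hive_sync.use_jdbc", "false") // assumed option name
  .mode(SaveMode.Append)
  .save("/projects/cdp/data/cdp_reporting/trr")

If the flag is only reachable through the standalone HiveSyncTool rather than
the datasource options, a pointer to the right CLI argument would work just as
well.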



Below is my log. It establishes the Hive JDBC connection successfully, but the
sync then fails with a method-not-available error ('get_table_req') when Hudi
checks whether the table exists through the metastore client.


20/05/26 15:38:15 INFO HoodieSparkSqlWriter$: Syncing to Hive Metastore
(URL: jdbc:hive2://server1.visa.com:2181,server2.visa.com:2181,
server3.visa.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
)

20/05/26 15:38:15 INFO FSUtils: Hadoop Configuration: fs.defaultFS:
[hdfs://oprhqanameservice], Config:[Configuration: core-default.xml,
core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml,
yarn-site.xml, hdfs-default.xml, hdfs-site.xml, __spark_hadoop_conf__.xml,
file:/etc/spark2/2.6.5.179-4/0/hive-site.xml], FileSystem:
[DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_1153590032_1, ugi=
svchdc36q@VISA.COM (auth:KERBEROS)]]]

20/05/26 15:38:15 INFO HiveConf: Found configuration file
file:/etc/spark2/2.6.5.179-4/0/hive-site.xml

20/05/26 15:38:16 INFO HoodieTableMetaClient: Loading HoodieTableMetaClient
from /projects/cdp/data/cdp_reporting/trr

20/05/26 15:38:16 INFO FSUtils: Hadoop Configuration: fs.defaultFS:
[hdfs://oprhqanameservice], Config:[Configuration: core-default.xml,
core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml,
yarn-site.xml, hdfs-default.xml, hdfs-site.xml, __spark_hadoop_conf__.xml,
file:/etc/spark2/2.6.5.179-4/0/hive-site.xml], FileSystem:
[DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_1153590032_1, ugi=
svchdc36q@VISA.COM (auth:KERBEROS)]]]

20/05/26 15:38:16 INFO HoodieTableConfig: Loading dataset properties from
/projects/cdp/data/cdp_reporting/trr/.hoodie/hoodie.properties

20/05/26 15:38:16 INFO HoodieTableMetaClient: Finished Loading Table of
type COPY_ON_WRITE from /projects/cdp/data/cdp_reporting/trr

20/05/26 15:38:16 INFO HoodieTableMetaClient: Loading Active commit
timeline for /projects/cdp/data/cdp_reporting/trr

20/05/26 15:38:16 INFO HoodieActiveTimeline: Loaded instants
java.util.stream.ReferencePipeline$Head@a1fca5a

20/05/26 15:38:16 INFO HoodieHiveClient: Creating hive connection
jdbc:hive2://server1.visa.com:2181,server2.visa.com:2181,
server3.visa.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

20/05/26 15:38:16 INFO Utils: Supplied authorities: server1.visa.com:2181,
server2.visa.com:2181,server3.visa.com:2181

20/05/26 15:38:16 INFO CuratorFrameworkImpl: Starting

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:zookeeper.version=3.4.6-4--1, built on 08/09/2019 23:18 GMT

20/05/26 15:38:16 INFO ZooKeeper: Client environment:host.name=
server4.visa.com

20/05/26 15:38:16 INFO ZooKeeper: Client environment:java.version=1.8.0_241

20/05/26 15:38:16 INFO ZooKeeper: Client environment:java.vendor=Oracle
Corporation

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:java.home=/usr/java/jdk1.8.0_241-amd64/jre

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:java.class.path=/usr/hdp/current/spark2-client/conf/:/usr/hdp/current/spark2-client/jars/hk2-api-2.4.0-b34.jar:/usr/hdp/current/spark2-client/jars/JavaEWAH-0.3.2.jar:/usr/hdp/current/spark2-client/jars/commons-pool-1.5.4.jar:/usr/hdp/current/spark2-client/jars/RoaringBitmap-0.5.11.jar:/usr/hdp/current/spark2-client/jars/hk2-locator-2.4.0-b34.jar:/usr/hdp/current/spark2-client/jars/ST4-4.0.4.jar:/usr/hdp/current/spark2-client/jars/compress-lzf-1.0.3.jar:/usr/hdp/current/spark2-client/jars/activation-1.1.1.jar:/usr/hdp/current/spark2-client/jars/core-1.1.2.jar:/usr/hdp/current/spark2-client/jars/aircompressor-0.8.jar:/usr/hdp/current/spark2-client/jars/hk2-utils-2.4.0-b34.jar:/usr/hdp/current/spark2-client/jars/antlr-2.7.7.jar:/usr/hdp/current/spark2-client/jars/curator-client-2.7.1.jar:/usr/hdp/current/spark2-client/jars/antlr-runtime-3.4.jar:/usr/hdp/current/spark2-client/jars/curator-framework-2.7.1.jar:/usr/hdp/current/spark2-client/jars/antlr4-runtime-4.7.jar:/usr/hdp/current/spark2-client/jars/ivy-2.4.0.jar:/usr/hdp/current/spark2-client/jars/aopalliance-1.0.jar:/usr/hdp/current/spark2-client/jars/commons-io-2.4.jar:/usr/hdp/current/spark2-client/jars/janino-3.0.8.jar:/usr/hdp/current/spark2-client/jars/aopalliance-repackaged-2.4.0-b34.jar:/usr/hdp/current/spark2-client/jars/commons-collections-3.2.2.jar:/usr/hdp/current/spark2-client/jars/apache-log4j-extras-1.2.17.jar:/usr/hdp/current/spark2-client/jars/curator-recipes-2.7.1.jar:/usr/hdp/current/spark2-client/jars/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/spark2-client/jars/commons-cli-1.2.jar:/usr/hdp/current/spark2-client/jars/javax.inject-1.jar:/usr/hdp/current/spark2-client/jars/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/spark2-client/jars/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/current/spark2-client/jars/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/spark2-client/jars/datanucleus-core-3.2.10.jar:/usr/hdp/current/spark2-client/jars/api-util-1.0.0-M20.jar:/usr/hdp/current/spark2-client/jars/datanucleus-rdbms-3.2.9.jar:/usr/hdp/current/spark2-client/jars/arpack_combined_all-0.1.jar:/usr/hdp/current/spark2-client/jars/derby-10.12.1.1.jar:/usr/hdp/current/spark2-client/jars/arrow-format-0.8.0.jar:/usr/hdp/current/spark2-client/jars/eigenbase-properties-1.1.5.jar:/usr/hdp/current/spark2-client/jars/arrow-memory-0.8.0.jar:/usr/hdp/current/spark2-client/jars/flatbuffers-1.2.0-3f79e055.jar:/usr/hdp/current/spark2-client/jars/arrow-vector-0.8.0.jar:/usr/hdp/current/spark2-client/jars/hppc-0.7.2.jar:/usr/hdp/current/spark2-client/jars/avro-1.7.7.jar:/usr/hdp/current/spark2-client/jars/httpclient-4.5.2.jar:/usr/hdp/current/spark2-client/jars/avro-ipc-1.7.7.jar:/usr/hdp/current/spark2-client/jars/commons-compiler-3.0.8.jar:/usr/hdp/current/spark2-client/jars/avro-mapred-1.7.7-hadoop2.jar:/usr/hdp/current/spark2-client/jars/commons-compress-1.4.1.jar:/usr/hdp/current/spark2-client/jars/aws-java-sdk-core-1.10.6.jar:/usr/hdp/current/spark2-client/jars/guava-14.0.1.jar:/usr/hdp/current/spark2-client/jars/aws-java-sdk-kms-1.10.6.jar:/usr/hdp/current/spark2-client/jars/gson-2.2.4.jar:/usr/hdp/current/spark2-client/jars/aws-java-sdk-s3-1.10.6.jar:/usr/hdp/current/spark2-client/jars/commons-configuration-1.6.jar:/usr/hdp/current/spark2-client/jars/azure-data-lake-store-sdk-2.1.4.jar:/usr/hdp/current/spark2-client/jars/commons-lang-2.6.jar:/usr/hdp/current/spark2-client/jars/jpam-1.1.jar:/usr/hdp/current/spark2-client/jars/azure-keyvault-core-0.8.0.jar:/usr/hdp/current/spark2-client/jars/guice-3.0.jar:/usr/hdp/current/spark2-cl
ient/jars/azure-storage-5.4.0.jar:/usr/hdp/current/spark2-client/jars/httpcore-4.4.4.jar:/usr/hdp/current/spark2-client/jars/base64-2.3.8.jar:/usr/hdp/current/spark2-client/jars/guice-servlet-3.0.jar:/usr/hdp/current/spark2-client/jars/bcprov-jdk15on-1.58.jar:/usr/hdp/current/spark2-client/jars/hadoop-aws-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/bonecp-0.8.0.RELEASE.jar:/usr/hdp/current/spark2-client/jars/commons-lang3-3.5.jar:/usr/hdp/current/spark2-client/jars/breeze-macros_2.11-0.13.2.jar:/usr/hdp/current/spark2-client/jars/hadoop-auth-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/breeze_2.11-0.13.2.jar:/usr/hdp/current/spark2-client/jars/commons-codec-1.10.jar:/usr/hdp/current/spark2-client/jars/jta-1.1.jar:/usr/hdp/current/spark2-client/jars/calcite-avatica-1.2.0-incubating.jar:/usr/hdp/current/spark2-client/jars/commons-logging-1.1.3.jar:/usr/hdp/current/spark2-client/jars/calcite-core-1.2.0-incubating.jar:/usr/hdp/current/spark2-client/jars/commons-math3-3.4.1.jar:/usr/hdp/current/spark2-client/jars/calcite-linq4j-1.2.0-incubating.jar:/usr/hdp/current/spark2-client/jars/hadoop-azure-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/chill-java-0.8.4.jar:/usr/hdp/current/spark2-client/jars/hadoop-client-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/chill_2.11-0.8.4.jar:/usr/hdp/current/spark2-client/jars/hadoop-common-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/commons-beanutils-1.7.0.jar:/usr/hdp/current/spark2-client/jars/gcs-connector-1.8.1.2.6.5.179-4-shaded.jar:/usr/hdp/current/spark2-client/jars/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/spark2-client/jars/commons-crypto-1.0.0.jar:/usr/hdp/current/spark2-client/jars/hadoop-hdfs-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/commons-dbcp-1.4.jar:/usr/hdp/current/spark2-client/jars/hive-beeline-1.21.2.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/commons-digester-1.8.jar:/usr/hdp/current/spark2-client/jars/hive-cli-1.21.2.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/commons-httpclient-3.1.jar:/usr/hdp/current/spark2-client/jars/jackson-core-2.6.7.jar:/usr/hdp/current/spark2-client/jars/commons-net-2.2.jar:/usr/hdp/current/spark2-client/jars/javolution-5.5.1.jar:/usr/hdp/current/spark2-client/jars/jersey-server-2.22.2.jar:/usr/hdp/current/spark2-client/jars/xz-1.0.jar:/usr/hdp/current/spark2-client/jars/hadoop-annotations-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jets3t-0.9.4.jar:/usr/hdp/current/spark2-client/jars/okhttp-2.7.5.jar:/usr/hdp/current/spark2-client/jars/hadoop-azure-datalake-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-container-servlet-2.22.2.jar:/usr/hdp/current/spark2-client/jars/hadoop-mapreduce-client-app-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jdo-api-3.0.1.jar:/usr/hdp/current/spark2-client/jars/libfb303-0.9.3.jar:/usr/hdp/current/spark2-client/jars/hadoop-mapreduce-client-common-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-container-servlet-core-2.22.2.jar:/usr/hdp/current/spark2-client/jars/hadoop-mapreduce-client-core-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-client-2.22.2.jar:/usr/hdp/current/spark2-client/jars/netty-3.9.9.Final.jar:/usr/hdp/current/spark2-client/jars/hadoop-mapreduce-client-jobclient-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-common-2.22.2.jar:/usr/hdp/current/spark2-client/jars/netty-all-4.1.17.Final.jar:/usr/hdp/current/spark2-client/jars/hadoop-mapreduce-client-shuffle-2.7.3.2.6
.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jetty-6.1.26.hwx.jar:/usr/hdp/current/spark2-client/jars/okio-1.6.0.jar:/usr/hdp/current/spark2-client/jars/hadoop-openstack-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-api-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-client-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jline-2.12.1.jar:/usr/hdp/current/spark2-client/jars/opencsv-2.3.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-common-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/joda-time-2.9.3.jar:/usr/hdp/current/spark2-client/jars/oro-2.0.8.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-registry-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-guava-2.22.2.jar:/usr/hdp/current/spark2-client/jars/paranamer-2.8.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-server-common-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jersey-media-jaxb-2.22.2.jar:/usr/hdp/current/spark2-client/jars/py4j-0.10.6.jar:/usr/hdp/current/spark2-client/jars/hadoop-yarn-server-web-proxy-2.7.3.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/json4s-ast_2.11-3.2.11.jar:/usr/hdp/current/spark2-client/jars/hive-exec-1.21.2.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/json4s-core_2.11-3.2.11.jar:/usr/hdp/current/spark2-client/jars/hive-jdbc-1.21.2.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/jodd-core-3.5.2.jar:/usr/hdp/current/spark2-client/jars/pyrolite-4.13.jar:/usr/hdp/current/spark2-client/jars/hive-metastore-1.21.2.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/json4s-jackson_2.11-3.2.11.jar:/usr/hdp/current/spark2-client/jars/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/spark2-client/jars/jsp-api-2.1.jar:/usr/hdp/current/spark2-client/jars/jackson-annotations-2.6.7.jar:/usr/hdp/current/spark2-client/jars/libthrift-0.9.3.jar:/usr/hdp/current/spark2-client/jars/jackson-core-asl-1.9.13.jar:/usr/hdp/current/spark2-client/jars/jsr305-1.3.9.jar:/usr/hdp/current/spark2-client/jars/jackson-databind-2.6.7.1.jar:/usr/hdp/current/spark2-client/jars/jtransforms-2.4.0.jar:/usr/hdp/current/spark2-client/jars/jackson-dataformat-cbor-2.6.7.jar:/usr/hdp/current/spark2-client/jars/log4j-1.2.17.jar:/usr/hdp/current/spark2-client/jars/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/spark2-client/jars/jul-to-slf4j-1.7.16.jar:/usr/hdp/current/spark2-client/jars/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/spark2-client/jars/kryo-shaded-3.0.3.jar:/usr/hdp/current/spark2-client/jars/jackson-module-paranamer-2.7.9.jar:/usr/hdp/current/spark2-client/jars/json-smart-1.3.1.jar:/usr/hdp/current/spark2-client/jars/scalap-2.11.8.jar:/usr/hdp/current/spark2-client/jars/jackson-module-scala_2.11-2.6.7.1.jar:/usr/hdp/current/spark2-client/jars/lz4-java-1.4.0.jar:/usr/hdp/current/spark2-client/jars/jackson-xc-1.9.13.jar:/usr/hdp/current/spark2-client/jars/machinist_2.11-0.6.1.jar:/usr/hdp/current/spark2-client/jars/java-xmlbuilder-1.1.jar:/usr/hdp/current/spark2-client/jars/macro-compat_2.11-1.1.1.jar:/usr/hdp/current/spark2-client/jars/javassist-3.18.1-GA.jar:/usr/hdp/current/spark2-client/jars/leveldbjni-all-1.8.jar:/usr/hdp/current/spark2-client/jars/javax.annotation-api-1.2.jar:/usr/hdp/current/spark2-client/jars/metrics-core-3.1.5.jar:/usr/hdp/current/spark2-client/jars/javax.inject-2.4.0-b34.jar:/usr/hdp/current/spark2-client/jars/metrics-graphite-3.1.5.jar:/usr/hdp/current/spa
rk2-client/jars/javax.servlet-api-3.1.0.jar:/usr/hdp/current/spark2-client/jars/metrics-json-3.1.5.jar:/usr/hdp/current/spark2-client/jars/javax.ws.rs-api-2.0.1.jar:/usr/hdp/current/spark2-client/jars/nimbus-jose-jwt-4.41.1.jar:/usr/hdp/current/spark2-client/jars/jaxb-api-2.2.2.jar:/usr/hdp/current/spark2-client/jars/metrics-jvm-3.1.5.jar:/usr/hdp/current/spark2-client/jars/jcip-annotations-1.0-1.jar:/usr/hdp/current/spark2-client/jars/minlog-1.3.0.jar:/usr/hdp/current/spark2-client/jars/jcl-over-slf4j-1.7.16.jar:/usr/hdp/current/spark2-client/jars/parquet-column-1.8.2.jar:/usr/hdp/current/spark2-client/jars/objenesis-2.1.jar:/usr/hdp/current/spark2-client/jars/spire_2.11-0.13.0.jar:/usr/hdp/current/spark2-client/jars/orc-core-1.4.3.2.6.5.179-4-nohive.jar:/usr/hdp/current/spark2-client/jars/stax-api-1.0-2.jar:/usr/hdp/current/spark2-client/jars/orc-mapreduce-1.4.3.2.6.5.179-4-nohive.jar:/usr/hdp/current/spark2-client/jars/osgi-resource-locator-1.0.1.jar:/usr/hdp/current/spark2-client/jars/parquet-common-1.8.2.jar:/usr/hdp/current/spark2-client/jars/parquet-encoding-1.8.2.jar:/usr/hdp/current/spark2-client/jars/parquet-format-2.3.1.jar:/usr/hdp/current/spark2-client/jars/parquet-hadoop-1.8.2.jar:/usr/hdp/current/spark2-client/jars/parquet-hadoop-bundle-1.6.0.jar:/usr/hdp/current/spark2-client/jars/parquet-jackson-1.8.2.jar:/usr/hdp/current/spark2-client/jars/protobuf-java-2.5.0.jar:/usr/hdp/current/spark2-client/jars/scala-compiler-2.11.8.jar:/usr/hdp/current/spark2-client/jars/scala-library-2.11.8.jar:/usr/hdp/current/spark2-client/jars/stax-api-1.0.1.jar:/usr/hdp/current/spark2-client/jars/scala-parser-combinators_2.11-1.0.4.jar:/usr/hdp/current/spark2-client/jars/scala-reflect-2.11.8.jar:/usr/hdp/current/spark2-client/jars/scala-xml_2.11-1.0.5.jar:/usr/hdp/current/spark2-client/jars/shapeless_2.11-2.3.2.jar:/usr/hdp/current/spark2-client/jars/slf4j-api-1.7.16.jar:/usr/hdp/current/spark2-client/jars/slf4j-log4j12-1.7.16.jar:/usr/hdp/current/spark2-client/jars/snappy-0.2.jar:/usr/hdp/current/spark2-client/jars/snappy-java-1.1.2.6.jar:/usr/hdp/current/spark2-client/jars/stream-2.7.0.jar:/usr/hdp/current/spark2-client/jars/spark-catalyst_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/stringtemplate-3.2.1.jar:/usr/hdp/current/spark2-client/jars/spark-cloud_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/super-csv-2.2.0.jar:/usr/hdp/current/spark2-client/jars/spark-core_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/univocity-parsers-2.5.9.jar:/usr/hdp/current/spark2-client/jars/spark-graphx_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-unsafe_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-hadoop-cloud_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-tags_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-hive-thriftserver_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/validation-api-1.1.0.Final.jar:/usr/hdp/current/spark2-client/jars/spark-hive_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/xbean-asm5-shaded-4.4.jar:/usr/hdp/current/spark2-client/jars/spark-kvstore_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/xercesImpl-2.9.1.jar:/usr/hdp/current/spark2-client/jars/spark-launcher_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-mllib-local_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/xmlenc-0.52.jar:/usr/hdp/current/spark2-client/jars/spark-mllib_2.11-2.3.0.2.6.5.179-4.jar:/usr/h
dp/current/spark2-client/jars/spark-yarn_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-network-common_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spire-macros_2.11-0.13.0.jar:/usr/hdp/current/spark2-client/jars/spark-network-shuffle_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/zookeeper-3.4.6.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-repl_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/zstd-jni-1.3.2-2.jar:/usr/hdp/current/spark2-client/jars/spark-sketch_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-sql_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/current/spark2-client/jars/spark-streaming_2.11-2.3.0.2.6.5.179-4.jar:/usr/hdp/2.6.5.179-4/hadoop/conf/

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:java.library.path=:/export/home/sobla/oracle_client/instantclient_19_5:/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib

20/05/26 15:38:16 INFO ZooKeeper: Client environment:java.io.tmpdir=/tmp

20/05/26 15:38:16 INFO ZooKeeper: Client environment:java.compiler=<NA>

20/05/26 15:38:16 INFO ZooKeeper: Client environment:os.name=Linux

20/05/26 15:38:16 INFO ZooKeeper: Client environment:os.arch=amd64

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:os.version=3.10.0-1062.9.1.el7.x86_64

20/05/26 15:38:16 INFO ZooKeeper: Client environment:user.name=svchdc36q

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:user.home=/home/svchdc36q

20/05/26 15:38:16 INFO ZooKeeper: Client
environment:user.dir=/home/svchdc36q

20/05/26 15:38:16 INFO ZooKeeper: Initiating client connection,
connectString=server1.visa.com:2181,server2.visa.com:2181,server3.visa.com:2181
sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@4ed31bc9

20/05/26 15:38:16 INFO ClientCnxn: Opening socket connection to server
server2.visa.com/x.x.x.x:2181. Will not attempt to authenticate using SASL
(unknown error)

20/05/26 15:38:16 INFO ClientCnxn: Socket connection established, initiating
session, client: /x.x.x.x:36938, server: server2.visa.com/x.x.x.x:2181

20/05/26 15:38:16 INFO ClientCnxn: Session establishment complete on server
server2.visa.com/x.x.x.x:2181, sessionid = 0x27234630fb51f5b, negotiated
timeout = 40000

20/05/26 15:38:16 INFO ConnectionStateManager: State change: CONNECTED

20/05/26 15:38:17 INFO ZooKeeper: Session: 0x27234630fb51f5b closed

20/05/26 15:38:17 INFO ClientCnxn: EventThread shut down

20/05/26 15:38:17 INFO Utils: Resolved authority: server2.visa.com:10000

20/05/26 15:38:17 INFO HiveConnection: Will try to open client transport
with JDBC Uri: jdbc:hive2://
server2.visa.com:10000/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

20/05/26 15:38:19 INFO HoodieHiveClient: Successfully established Hive
connection to  jdbc:hive2://server1.visa.com:2181,server2.visa.com:2181,
server3.visa.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

20/05/26 15:38:19 INFO metastore: Trying to connect to metastore with URI
thrift://server2.visa.com:9083

20/05/26 15:38:19 INFO metastore: Opened a connection to metastore, current
connections: 1

20/05/26 15:38:19 INFO metastore: Connected to metastore.

20/05/26 15:38:19 INFO HiveSyncTool: Trying to sync hoodie table trr with
base path /projects/cdp/data/cdp_reporting/trr of type COPY_ON_WRITE

20/05/26 15:38:19 ERROR DBUtil$: [App] *********************** Exception
occurred in baseTableWrite for trr : Failed to check if table exists trr

org.apache.hudi.hive.HoodieHiveSyncException: Failed to check if table exists trr

at org.apache.hudi.hive.HoodieHiveClient.doesTableExist(HoodieHiveClient.java:459)

at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:91)

at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:67)

at org.apache.hudi.HoodieSparkSqlWriter$.syncHive(HoodieSparkSqlWriter.scala:235)

at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:169)

at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)

at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)

at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)

at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)

at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)

at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)

at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)

at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)

at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)

at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)

at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)

at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)

at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)

at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)

at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)

at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)

at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)

at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)

at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)

at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)

at com.cybs.cdp.reporting.trr.DBUtil$.transactionTableWrite(DBUtil.scala:62)

at com.cybs.cdp.reporting.trr.TRREngine$.startEngine(TRREngine.scala:45)

at com.cybs.cdp.reporting.trr.TRREngine$.main(TRREngine.scala:23)

at com.cybs.cdp.reporting.trr.TRREngine.main(TRREngine.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: org.apache.thrift.TApplicationException: Invalid method name: 'get_table_req'

at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)

at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)

at org.apache.hudi.org.apache.hadoop_hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table_req(ThriftHiveMetastore.java:1563)

at org.apache.hudi.org.apache.hadoop_hive.metastore.api.ThriftHiveMetastore$Client.get_table_req(ThriftHiveMetastore.java:1550)

at org.apache.hudi.org.apache.hadoop_hive.metastore.HiveMetaStoreClient.tableExists(HiveMetaStoreClient.java:1443)

at org.apache.hudi.hive.HoodieHiveClient.doesTableExist(HoodieHiveClient.java:457)

... 38 more

Exception in thread "main" java.lang.Exception: Failed to check if table exists trr

at com.cybs.cdp.reporting.trr.DBUtil$.transactionTableWrite(DBUtil.scala:69)

at com.cybs.cdp.reporting.trr.TRREngine$.startEngine(TRREngine.scala:45)

at com.cybs.cdp.reporting.trr.TRREngine$.main(TRREngine.scala:23)

at com.cybs.cdp.reporting.trr.TRREngine.main(TRREngine.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)



Thanks,

Selva



On Mon, Mar 2, 2020 at 10:02 AM Vinoth Chandar <vi...@apache.org> wrote:

> Hi Selva,
>
> See if this helps.
>
> https://lists.apache.org/thread.html/e1fd539ac438276dd7feb2bc813bf85f84a95f7f25b638488eb2e110%40%3Cdev.hudi.apache.org%3E
>
> It's a long thread, but you can probably skim the last few conversations
> around Hive 1.x.
>
> Thanks
> Vinoth
>
> On Sun, Mar 1, 2020 at 5:26 PM selvaraj periyasamy <
> selvaraj.periyasamy1983@gmail.com> wrote:
>
> > Thanks Vinoth. We do have a plan to move to Hive 2.x in the near future.
> > Can I get any info on the workaround for Hive 1.x versions?
> >
> > Thanks,
> > Selva
> >
> > On Sun, Mar 1, 2020 at 3:19 PM Vinoth Chandar <vi...@apache.org> wrote:
> >
> > > We dropped support for Hive 1.x a while back. Would you be able to
> > > move to Hive 2.x?
> > >
> > > IIRC there were some workarounds discussed on this thread before. But,
> > > given the push towards Hive 3.x, it's good to be on 2.x at least.
> > > Let me know and we can go from there :)
> > >
> > > On Sun, Mar 1, 2020 at 1:09 PM selvaraj periyasamy <
> > > selvaraj.periyasamy1983@gmail.com> wrote:
> > >
> > > > I am using Hudi 0.5.0 and writing with the Spark writer.
> > > >
> > > > My spark version is 2.3.0
> > > > Scala version 2.11.8
> > > > Hive version 1.2.2
> > > >
> > > > The write succeeds but the Hive call is failing. From some Google
> > > > references, it seems the Hive client is a higher version than the
> > > > server. Since Hudi is built against Hive 2.3.1, is there a way to
> > > > use 1.2.2?
> > > >
> > > > 2020-03-01 12:16:50 WARN  HoodieSparkSqlWriter$:110 - hoodie dataset at
> > > > hdfs://localhost:9000/projects/cdp/data/attunity_poc/attunity_rep_base
> > > > already exists. Deleting existing data & overwriting with new data.
> > > > 2020-03-01 12:16:51 ERROR HiveConnection:697 - Error opening session
> > > > org.apache.thrift.TApplicationException: Required field 'client_protocol'
> > > > is unset! Struct:TOpenSessionReq(client_protocol:null,
> > > > configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000,
> > > > use:database=default})
> > > > at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
> > > > at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
> > > > at org.apache.hudi.org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:168)
> > > > at org.apache.hudi.org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:155)
> > > >
> > > >
> > > > Thanks,
> > > > Selva
> > > >
> > >
> >
>

Re: Hudi 0.5.0 -> Hive JDBC call fails

Posted by selvaraj periyasamy <se...@gmail.com>.
I opened a GitHub issue to track it:
https://github.com/apache/hudi/issues/1679

Thanks,
Selva

>