Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/05/16 08:41:11 UTC
[GitHub] [hudi] dongkelun opened a new pull request, #5592: [HUDI-4103] TestCreateTable failed CTAS when indicating hoodie.database.name in table properties
dongkelun opened a new pull request, #5592:
URL: https://github.com/apache/hudi/pull/5592
## *Tips*
- *Thank you very much for contributing to Apache Hudi.*
- *Please review https://hudi.apache.org/contribute/how-to-contribute before opening a pull request.*
## What is the purpose of the pull request
*(For example: This pull request adds quick-start document.)*
## Brief change log
*(for example:)*
- *Modify AnnotationLocation checkstyle rule in checkstyle.xml*
## Verify this pull request
*(Please pick one of the following options)*
This pull request is a trivial rework / code cleanup without any test coverage.
*(or)*
This pull request is already covered by existing tests, such as *(please describe tests)*.
*(or)*
This change added tests and can be verified as follows:
*(example:)*
- *Added integration tests for end-to-end.*
- *Added HoodieClientWriteTest to verify the change.*
- *Manually verified the change by running a job locally.*
## Committer checklist
- [ ] Has a corresponding JIRA in PR title & commit
- [ ] Commit message is descriptive of the change
- [ ] CI is green
- [ ] Necessary doc changes done or have another open PR
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [hudi] jinxing64 commented on a diff in pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
jinxing64 commented on code in PR #5592:
URL: https://github.com/apache/hudi/pull/5592#discussion_r873487584
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestCreateTable.scala:
##########
@@ -383,80 +383,86 @@ class TestCreateTable extends HoodieSparkSqlTestBase {
}
test("Test Create Table As Select With Tblproperties For Filter Props") {
Review Comment:
Hi @dongkelun
Thanks for this fast fix. I ran this patch locally, but this test still fails with "org.apache.hudi.exception.HoodieException: Config conflict(key current value existing value):
hoodie.database.name: databaseName default"
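The conflict reported above can be sketched with a minimal Spark SQL CTAS. This is an illustrative fragment, not the actual test code: the table and column names are made up, but the shape matches the error, where Hudi derives hoodie.database.name from the session's current database ("default") and the explicit TBLPROPERTIES value ("databaseName") clashes with it.

```sql
-- Hypothetical minimal reproduction (names are illustrative):
-- the session's current database is "default", so Hudi expects
-- hoodie.database.name = default; the tblproperties value conflicts.
create table h1 using hudi
tblproperties (
  primaryKey = 'id',
  'hoodie.database.name' = 'databaseName'
)
as select 1 as id, 'a1' as name;
```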
[GitHub] [hudi] jinxing64 commented on a diff in pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
jinxing64 commented on code in PR #5592:
URL: https://github.com/apache/hudi/pull/5592#discussion_r873495466
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestCreateTable.scala:
##########
@@ -383,80 +383,86 @@ class TestCreateTable extends HoodieSparkSqlTestBase {
}
test("Test Create Table As Select With Tblproperties For Filter Props") {
Review Comment:
Thanks a lot for your efforts~
I can reproduce this locally; the stack trace is as below
```
/Library/Java/JavaVirtualMachines/jdk1.8.0_281.jdk/Contents/Home/bin/java "-javaagent:/Applications/IntelliJ IDEA CE.app/Contents/lib/idea_rt.jar=58989:/Applications/IntelliJ IDEA CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "[IntelliJ test-runner classpath elided for brevity: JDK 1.8.0_281, Scala 2.12, Spark 3.2.1, Hadoop 2.10.1, Hive 2.3.x, HBase 2.4.9, Parquet 1.12.2, plus the local Hudi module target/classes directories; the paste is truncated before the stack trace itself]"
nxing/.m2/repository/org/apache/slider/slider-core/0.90.2-incubating/slider-core-0.90.2-incubating.jar:/Users/jinxing/.m2/repository/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar:/Users/jinxing/.m2/repository/org/apache/hive/hive-llap-common/2.3.1/hive-llap-common-2.3.1-tests.jar:/Users/jinxing/.m2/repository/net/sf/jpam/jpam/1.1/jpam-1.1.jar:/Users/jinxing/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/Users/jinxing/.m2/repository/ant/ant/1.6.5/ant-1.6.5.jar:/Users/jinxing/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/Users/jinxing/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/jinxing/.m2/repository/org/apache/hive/hive-service-rpc/2.3.1/hive-service-rpc-2.3.1.jar:/Users/jinxing/.m2/repository/org/apache/httpcomponents/httpcore/4.4.1/httpcore-4.4.1.jar:/Users/jinxing/.m2/repository/org/apache/hive/hive-metastore/2.3.1/hive-metastore-2.3.1.jar:/Users/jinxing/.m2/repository/javolution/javol
ution/5.5.1/javolution-5.5.1.jar:/Users/jinxing/.m2/repository/com/jolbox/bonecp/0.8.0.RELEASE/bonecp-0.8.0.RELEASE.jar:/Users/jinxing/.m2/repository/com/zaxxer/HikariCP/2.5.1/HikariCP-2.5.1.jar:/Users/jinxing/.m2/repository/org/datanucleus/datanucleus-api-jdo/4.2.4/datanucleus-api-jdo-4.2.4.jar:/Users/jinxing/.m2/repository/org/datanucleus/datanucleus-rdbms/4.1.19/datanucleus-rdbms-4.1.19.jar:/Users/jinxing/.m2/repository/commons-pool/commons-pool/1.5.4/commons-pool-1.5.4.jar:/Users/jinxing/.m2/repository/commons-dbcp/commons-dbcp/1.4/commons-dbcp-1.4.jar:/Users/jinxing/.m2/repository/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar:/Users/jinxing/.m2/repository/org/datanucleus/javax.jdo/3.2.0-m3/javax.jdo-3.2.0-m3.jar:/Users/jinxing/.m2/repository/co/cask/tephra/tephra-api/0.6.0/tephra-api-0.6.0.jar:/Users/jinxing/.m2/repository/co/cask/tephra/tephra-core/0.6.0/tephra-core-0.6.0.jar:/Users/jinxing/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/jinxing/.m2/repository/java
x/inject/javax.inject/1/javax.inject-1.jar:/Users/jinxing/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/jinxing/.m2/repository/com/google/inject/extensions/guice-assistedinject/3.0/guice-assistedinject-3.0.jar:/Users/jinxing/.m2/repository/it/unimi/dsi/fastutil/6.5.6/fastutil-6.5.6.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-common/0.6.0-incubating/twill-common-0.6.0-incubating.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-core/0.6.0-incubating/twill-core-0.6.0-incubating.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-api/0.6.0-incubating/twill-api-0.6.0-incubating.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-discovery-api/0.6.0-incubating/twill-discovery-api-0.6.0-incubating.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-discovery-core/0.6.0-incubating/twill-discovery-core-0.6.0-incubating.jar:/Users/jinxing/.m2/repository/org/apache/twill/twill-zookeeper/0.6.0-incubating/twill-zookeeper-0.6.0-incubatin
g.jar:/Users/jinxing/.m2/repository/co/cask/tephra/tephra-hbase-compat-1.0/0.6.0/tephra-hbase-compat-1.0-0.6.0.jar:/Users/jinxing/.m2/repository/org/apache/hive/hive-common/2.3.1/hive-common-2.3.1.jar:/Users/jinxing/.m2/repository/jline/jline/2.12/jline-2.12.jar:/Users/jinxing/.m2/repository/com/tdunning/json/1.8/json-1.8.jar:/Users/jinxing/.m2/repository/com/github/joshelser/dropwizard-metrics-hadoop-metrics2-reporter/0.1.2/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:/Users/jinxing/.m2/repository/org/apache/curator/curator-framework/2.7.1/curator-framework-2.7.1.jar:/Users/jinxing/.m2/repository/org/apache/curator/curator-client/2.7.1/curator-client-2.7.1.jar:/Users/jinxing/.m2/repository/org/apache/curator/curator-recipes/2.7.1/curator-recipes-2.7.1.jar:/Users/jinxing/workspace/hudi/hudi-client/hudi-client-common/target/test-classes:/Users/jinxing/workspace/hudi/hudi-client/hudi-spark-client/target/test-classes:/Users/jinxing/workspace/hudi/hudi-common/target/test-classe
s:/Users/jinxing/workspace/hudi/hudi-client/hudi-java-client/target/classes:/Users/jinxing/.m2/repository/org/scalatest/scalatest_2.12/3.1.0/scalatest_2.12-3.1.0.jar:/Users/jinxing/.m2/repository/org/scalatest/scalatest-compatible/3.1.0/scalatest-compatible-3.1.0.jar:/Users/jinxing/.m2/repository/org/scalactic/scalactic_2.12/3.1.0/scalactic_2.12-3.1.0.jar:/Users/jinxing/.m2/repository/org/junit/jupiter/junit-jupiter-api/5.7.0-M1/junit-jupiter-api-5.7.0-M1.jar:/Users/jinxing/.m2/repository/org/apiguardian/apiguardian-api/1.1.0/apiguardian-api-1.1.0.jar:/Users/jinxing/.m2/repository/org/opentest4j/opentest4j/1.2.0/opentest4j-1.2.0.jar:/Users/jinxing/.m2/repository/org/junit/platform/junit-platform-commons/1.7.0-M1/junit-platform-commons-1.7.0-M1.jar:/Users/jinxing/.m2/repository/org/junit/jupiter/junit-jupiter-engine/5.7.0-M1/junit-jupiter-engine-5.7.0-M1.jar:/Users/jinxing/.m2/repository/org/junit/platform/junit-platform-engine/1.7.0-M1/junit-platform-engine-1.7.0-M1.jar:/Users/jinxi
ng/.m2/repository/org/junit/vintage/junit-vintage-engine/5.7.0-M1/junit-vintage-engine-5.7.0-M1.jar:/Users/jinxing/.m2/repository/junit/junit/4.13/junit-4.13.jar:/Users/jinxing/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/Users/jinxing/.m2/repository/org/junit/jupiter/junit-jupiter-params/5.7.0-M1/junit-jupiter-params-5.7.0-M1.jar:/Users/jinxing/.m2/repository/org/mockito/mockito-junit-jupiter/3.3.3/mockito-junit-jupiter-3.3.3.jar:/Users/jinxing/.m2/repository/org/mockito/mockito-core/3.3.3/mockito-core-3.3.3.jar:/Users/jinxing/.m2/repository/net/bytebuddy/byte-buddy/1.10.5/byte-buddy-1.10.5.jar:/Users/jinxing/.m2/repository/net/bytebuddy/byte-buddy-agent/1.10.5/byte-buddy-agent-1.10.5.jar:/Users/jinxing/.m2/repository/org/junit/platform/junit-platform-runner/1.7.0-M1/junit-platform-runner-1.7.0-M1.jar:/Users/jinxing/.m2/repository/org/junit/platform/junit-platform-launcher/1.7.0-M1/junit-platform-launcher-1.7.0-M1.jar:/Users/jinxing/.m2/repository/org/junit/
platform/junit-platform-suite-api/1.7.0-M1/junit-platform-suite-api-1.7.0-M1.jar:/Users/jinxing/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/Users/jinxing/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.10.1/hadoop-hdfs-2.10.1-tests.jar:/Users/jinxing/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/Users/jinxing/.m2/repository/io/netty/netty/3.10.6.Final/netty-3.10.6.Final.jar:/Users/jinxing/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/Users/jinxing/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/Users/jinxing/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar" org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner -s org.apache.spark.sql.hudi.TestCreateTable -testName "Test Create Table As Select With Tblproperties For Filter Props" -showProgressMessages true
Testing started at 4:58 PM ...
0 [ScalaTest-run-running-TestCreateTable] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
5952 [ScalaTest-run-running-TestCreateTable] WARN org.apache.hudi.common.config.DFSPropertiesConfiguration - Cannot find HUDI_CONF_DIR, please set it as the dir of hudi-defaults.conf
5973 [ScalaTest-run-running-TestCreateTable] WARN org.apache.hudi.common.config.DFSPropertiesConfiguration - Properties file file:/etc/hudi/conf/hudi-defaults.conf not found. Ignoring to load props file
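The two DFSPropertiesConfiguration warnings above are benign for this test run; they only say no global defaults file was found. If you want to silence them locally, the warning itself suggests the fix: point HUDI_CONF_DIR at a directory containing a hudi-defaults.conf. A minimal sketch (the /tmp path is illustrative, not from the log):

```shell
# Create a directory with an (empty) hudi-defaults.conf and point Hudi at it.
# Any directory works; /tmp/hudi-conf is just an example.
mkdir -p /tmp/hudi-conf
touch /tmp/hudi-conf/hudi-defaults.conf
export HUDI_CONF_DIR=/tmp/hudi-conf
```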
6532 [ScalaTest-run-running-TestCreateTable] WARN org.apache.spark.sql.catalyst.util.package - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
6556 [ScalaTest-run-running-TestCreateTable] ERROR org.apache.spark.util.Utils - Aborting task
org.apache.hudi.exception.HoodieException: Config conflict(key current value existing value):
hoodie.database.name: databaseName default
at org.apache.hudi.HoodieWriterUtils$.validateTableConfig(HoodieWriterUtils.scala:161)
at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:86)
at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:163)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:128)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
at org.apache.spark.sql.hudi.catalog.HoodieCatalog.$anonfun$saveSourceDF$1(HoodieCatalog.scala:301)
at scala.Option.map(Option.scala:230)
at org.apache.spark.sql.hudi.catalog.HoodieCatalog.saveSourceDF(HoodieCatalog.scala:298)
at org.apache.spark.sql.hudi.catalog.HoodieCatalog.createHoodieTable(HoodieCatalog.scala:256)
at org.apache.spark.sql.hudi.catalog.HoodieStagedTable.commitStagedChanges(HoodieStagedTable.scala:62)
at org.apache.spark.sql.execution.datasources.v2.TableWriteExecHelper.$anonfun$writeToTable$1(WriteToDataSourceV2Exec.scala:484)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1496)
at org.apache.spark.sql.execution.datasources.v2.TableWriteExecHelper.writeToTable(WriteToDataSourceV2Exec.scala:468)
at org.apache.spark.sql.execution.datasources.v2.TableWriteExecHelper.writeToTable$(WriteToDataSourceV2Exec.scala:463)
at org.apache.spark.sql.execution.datasources.v2.AtomicCreateTableAsSelectExec.writeToTable(WriteToDataSourceV2Exec.scala:106)
at org.apache.spark.sql.execution.datasources.v2.AtomicCreateTableAsSelectExec.run(WriteToDataSourceV2Exec.scala:127)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
at org.apache.spark.sql.hudi.TestCreateTable.$anonfun$new$20(TestCreateTable.scala:401)
at org.apache.spark.sql.hudi.TestCreateTable.$anonfun$new$20$adapted(TestCreateTable.scala:387)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.apache.spark.sql.hudi.TestCreateTable.$anonfun$new$19(TestCreateTable.scala:387)
at org.apache.spark.sql.hudi.TestCreateTable.$anonfun$new$19$adapted(TestCreateTable.scala:386)
at org.apache.spark.sql.hudi.HoodieSparkSqlTestBase.withTempDir(HoodieSparkSqlTestBase.scala:70)
at org.apache.spark.sql.hudi.TestCreateTable.$anonfun$new$18(TestCreateTable.scala:386)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.hudi.HoodieSparkSqlTestBase.$anonfun$test$1(HoodieSparkSqlTestBase.scala:78)
at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:189)
at org.scalatest.TestSuite.withFixture(TestSuite.scala:196)
at org.scalatest.TestSuite.withFixture$(TestSuite.scala:195)
at org.scalatest.funsuite.AnyFunSuite.withFixture(AnyFunSuite.scala:1562)
at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:187)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:199)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:199)
at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:181)
at org.scalatest.funsuite.AnyFunSuite.runTest(AnyFunSuite.scala:1562)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:232)
at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:232)
at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:231)
at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1562)
at org.scalatest.Suite.run(Suite.scala:1112)
at org.scalatest.Suite.run$(Suite.scala:1094)
at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1562)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:236)
at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:236)
at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:235)
at org.apache.spark.sql.hudi.HoodieSparkSqlTestBase.org$scalatest$BeforeAndAfterAll$$super$run(HoodieSparkSqlTestBase.scala:34)
at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
at org.apache.spark.sql.hudi.HoodieSparkSqlTestBase.run(HoodieSparkSqlTestBase.scala:34)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1314)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1308)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1308)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1474)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
at org.scalatest.tools.Runner$.run(Runner.scala:798)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
Config conflict(key current value existing value):
hoodie.database.name: databaseName default
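The failing frame is the sql(...) call at TestCreateTable.scala:401, i.e. a CTAS that sets hoodie.database.name via TBLPROPERTIES. A sketch of the kind of statement that triggers this conflict — table, column, and property values here are illustrative, not copied from the test source:

```sql
-- Hypothetical reproduction: the database name supplied in TBLPROPERTIES
-- ('databaseName') disagrees with the value already recorded for the table
-- ('default'), so the write-side config validation aborts the task.
CREATE TABLE h1
USING hudi
TBLPROPERTIES (
  primaryKey = 'id',
  'hoodie.database.name' = 'databaseName'
)
AS SELECT 1 AS id, 'a1' AS name;
```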
at org.scalatest.tools.Runner$.run(Runner.scala:798)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
Process finished with exit code 0
ANTLR Tool version 4.7 used for code generation does not match the current runtime version 4.8
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127601521
## CI report:
* 9c37bde227c389d50b382453f0e71f57e3e4b7f5 Azure: [SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8677)
* ff9686f608630edfc9d846e4bd7a0d7bf64eec86 Azure: [PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8683)
<details>
<summary>Bot commands</summary>
@hudi-bot supports the following commands:
- `@hudi-bot run azure` re-run the last Azure build
</details>
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127597519
## CI report:
* 9c37bde227c389d50b382453f0e71f57e3e4b7f5 Azure: [SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8677)
* ff9686f608630edfc9d846e4bd7a0d7bf64eec86 UNKNOWN
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127425836
## CI report:
* 9c37bde227c389d50b382453f0e71f57e3e4b7f5 UNKNOWN
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] [HUDI-4001] Filter the properties should not be used when create table for Spark SQL
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127814078
## CI report:
* ff9686f608630edfc9d846e4bd7a0d7bf64eec86 Azure: [SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8683)
[GitHub] [hudi] dongkelun commented on a diff in pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
dongkelun commented on code in PR #5592:
URL: https://github.com/apache/hudi/pull/5592#discussion_r873489948
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestCreateTable.scala:
##########
@@ -383,80 +383,86 @@ class TestCreateTable extends HoodieSparkSqlTestBase {
}
test("Test Create Table As Select With Tblproperties For Filter Props") {
Review Comment:
@jinxing64 I'm not sure I can fix it right now, because I can't reproduce this problem locally, so I submitted a PR to test it.
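For context, the scenario this test exercises is a CTAS statement that sets `hoodie.database.name` in TBLPROPERTIES, which the PR filters out at table-creation time. A minimal illustration (the table and column names here are made up, not the exact test body) might look like:

```sql
-- Hypothetical CTAS carrying an internal Hudi property in TBLPROPERTIES;
-- the fix filters such properties instead of persisting them on the table.
CREATE TABLE h1 USING hudi
TBLPROPERTIES (
  primaryKey = 'id',
  'hoodie.database.name' = 'databaseName'
)
AS SELECT 1 AS id, 'a1' AS name, 10 AS price;
```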
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127593729
## CI report:
* 9c37bde227c389d50b382453f0e71f57e3e4b7f5 Azure: [SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8677)
[GitHub] [hudi] dongkelun commented on a diff in pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable for Spark3.2
Posted by GitBox <gi...@apache.org>.
dongkelun commented on code in PR #5592:
URL: https://github.com/apache/hudi/pull/5592#discussion_r873719566
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestCreateTable.scala:
##########
@@ -383,80 +383,86 @@ class TestCreateTable extends HoodieSparkSqlTestBase {
}
test("Test Create Table As Select With Tblproperties For Filter Props") {
Review Comment:
@jinxing64 Thank you. I've found the cause.
[GitHub] [hudi] hudi-bot commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable
Posted by GitBox <gi...@apache.org>.
hudi-bot commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127429666
## CI report:
* 9c37bde227c389d50b382453f0e71f57e3e4b7f5 Azure: [PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=8677)
[GitHub] [hudi] dongkelun commented on pull request #5592: [HUDI-4103] Fix CTAS test failures in TestCreateTable for Spark3.2
Posted by GitBox <gi...@apache.org>.
dongkelun commented on PR #5592:
URL: https://github.com/apache/hudi/pull/5592#issuecomment-1127658812
@XuQianJin-Stars The CI has passed, can you please take a look?
[GitHub] [hudi] XuQianJin-Stars merged pull request #5592: [HUDI-4103] [HUDI-4001] Filter the properties should not be used when create table for Spark SQL
Posted by GitBox <gi...@apache.org>.
XuQianJin-Stars merged PR #5592:
URL: https://github.com/apache/hudi/pull/5592