Posted to commits@hudi.apache.org by gi...@apache.org on 2022/10/08 16:52:48 UTC

[hudi] branch dependabot/maven/com.google.protobuf-protobuf-java-3.21.7 updated (050ce213e4 -> efdbd9edeb)

This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/com.google.protobuf-protobuf-java-3.21.7
in repository https://gitbox.apache.org/repos/asf/hudi.git


 discard 050ce213e4 Bump protobuf-java from 3.21.5 to 3.21.7
     add 48e5bb0fed [HOTFIX] Fix source release validate script (#6865)
     add 9f5d16529d [HUDI-4980] Calculate avg record size using commit only (#6864)
     add 067cc24d88 Revert "[HUDI-4915] improve avro serializer/deserializer (#6788)" (#6809)
     add fb4f026580 [HUDI-4970] Update kafka-connect readme and refactor HoodieConfig#create (#6857)
     add 280194d3b6 Enhancing README for multi-writer tests (#6870)
     add fd8a947e61 [MINOR] Fix deploy script for flink 1.15 (#6872)
     add a51181726c [HUDI-4992] Fixing invalid min/max record key stats in Parquet metadata (#6883)
     add c5125d38b5 [HUDI-4972] Fixes to make unit tests work on m1 mac (#6751)
     add 06d924137b [HUDI-2786] Docker demo on mac aarch64 (#6859)
     add 9c1fa14fd6 add support for unraveling proto schemas
     add 510d525e15 fix some compile issues
     add aad9ec1320 naming and style updates
     add 8899274444 make test data random, reuse code
     add a922a5beca add test for 2 different recursion depths, fix schema cache key
     add 3b37dc95d9 add unsigned long support
     add 706291d4f3 better handle other types
     add c28e874fca rebase on 4904
     add 190cc16381 get all tests working
     add f18fff886e fix oneof expected schema, update tests after rebase
     add ff5baa8706 revert scala binary change
     add 0069da2d1a try a different method to avoid avro version
     add 71a39bf488 Merge remote-tracking branch 'origin/master' into HUDI-4905
     add c5dff63375 delete unused file
     add f53d47ea3b address PR feedback, update decimal precision
     add 1831639e39 fix isNullable issue, check if class is Int64value
     add eca2992d65 checkstyle fix
     add 423da6f7bb change wrapper descriptor set initialization
     add fb2d9f0030 add in testing for unsigned long to BigInteger conversion
     add f03f9610cf shade protobuf dependency
     add 57f8b81194 Merge remote-tracking branch 'origin/master' into HUDI-4905
     add 7d5b9dc0a9 Revert "shade protobuf dependency"
     add 5d2c2853ea [HUDI-4905] Improve type handling in proto schema conversion
     add 182475a854 [HUDI-4971] Fix shading kryo-shaded with reusing configs (#6873)
     add efdbd9edeb Bump protobuf-java from 3.21.5 to 3.21.7

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version. This situation occurs when a user
force-pushes a change (`git push --force`) and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (050ce213e4)
            \
             N -- N -- N   refs/heads/dependabot/maven/com.google.protobuf-protobuf-java-3.21.7 (efdbd9edeb)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.
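The discard scenario in the diagram above can be reproduced in a throwaway local repository. The following sketch (all repository and commit names are illustrative, not taken from the Hudi branch) rewrites history past a common base B and then checks that the old tip O is no longer reachable from the branch:

```shell
set -e

# Build a disposable demo repository with a linear history B -> O.
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo

git commit -q --allow-empty -m "B (common base)"
git commit -q --allow-empty -m "O (old tip, later discarded)"
old_tip=$(git rev-parse HEAD)

# Rewrite history: step back to B and add a new commit N, as a
# subsequent `git push --force` would publish to the remote branch.
git reset -q --hard HEAD~1
git commit -q --allow-empty -m "N (new tip)"

# O is no longer an ancestor of the branch tip, so a notification
# email for this push would mark it "discard".
if git merge-base --is-ancestor "$old_tip" HEAD; then
  result=reachable
else
  result=discarded
fi
echo "$result"   # -> discarded
```

Note that a "discard"ed commit is only gone from this ref; until garbage collection runs, it can still be recovered locally via `git reflog`.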

No new revisions were added by this update.

Summary of changes:
 ...ose_hadoop284_hive233_spark244_mac_aarch64.yml} | 131 +++--------
 docker/setup_demo.sh                               |  10 +-
 docker/stop_demo.sh                                |   7 +-
 .../cli/commands/TestUpgradeDowngradeCommand.java  |   6 +-
 .../apache/hudi/io/storage/HoodieOrcWriter.java    |  10 +-
 .../hudi/avro/TestHoodieAvroParquetWriter.java     | 118 ++++++++++
 .../hudi/io/storage/TestHoodieOrcReaderWriter.java |   7 +-
 .../row/HoodieRowDataParquetWriteSupport.java      |  55 ++---
 .../storage/row/HoodieRowParquetWriteSupport.java  |  61 +++---
 .../table/action/commit/UpsertPartitioner.java     |  16 +-
 .../row/TestHoodieInternalRowParquetWriter.java    |  95 ++++----
 .../hudi/table/upgrade/TestUpgradeDowngrade.java   |   6 +-
 .../apache/hudi/avro/HoodieAvroWriteSupport.java   |  60 +++--
 .../hudi/avro/HoodieBloomFilterWriteSupport.java   |  96 ++++++++
 .../apache/hudi/common/config/HoodieConfig.java    |   9 +-
 .../org/apache/hudi/common/util/BaseFileUtils.java |  13 +-
 .../hudi/avro/TestHoodieAvroWriteSupport.java      |  67 ------
 hudi-examples/hudi-examples-java/pom.xml           |   6 +
 hudi-integ-test/README.md                          |  52 ++++-
 hudi-kafka-connect/README.md                       |  11 +-
 .../TestUpgradeOrDowngradeProcedure.scala          |   5 +-
 .../apache/spark/sql/avro/AvroDeserializer.scala   |  20 +-
 .../org/apache/spark/sql/avro/AvroSerializer.scala |  17 +-
 .../apache/spark/sql/avro/AvroDeserializer.scala   |  20 +-
 .../org/apache/spark/sql/avro/AvroSerializer.scala |  19 +-
 .../apache/spark/sql/avro/AvroDeserializer.scala   |  20 +-
 .../org/apache/spark/sql/avro/AvroSerializer.scala |  19 +-
 .../apache/spark/sql/avro/AvroDeserializer.scala   |  20 +-
 .../org/apache/spark/sql/avro/AvroSerializer.scala |  19 +-
 hudi-timeline-service/pom.xml                      |   5 +
 hudi-utilities/pom.xml                             |   1 -
 .../schema/ProtoClassBasedSchemaProvider.java      |  19 +-
 .../sources/helpers/ProtoConversionUtil.java       | 242 ++++++++++++++-------
 .../schema/TestProtoClassBasedSchemaProvider.java  |  21 +-
 .../utilities/sources/TestProtoKafkaSource.java    |   2 +-
 .../sources/helpers/TestProtoConversionUtil.java   | 100 +++++++--
 .../schema-provider/proto/oneof_schema.avsc        |  42 ++++
 .../resources/schema-provider/proto/sample.proto   |   8 +
 ..._flattened.avsc => sample_schema_defaults.avsc} |  31 ++-
 ...le_schema_wrapped_and_timestamp_as_record.avsc} |  16 +-
 packaging/hudi-aws-bundle/pom.xml                  |  16 +-
 packaging/hudi-datahub-sync-bundle/pom.xml         |  16 +-
 packaging/hudi-flink-bundle/pom.xml                |  19 +-
 packaging/hudi-gcp-bundle/pom.xml                  |  16 +-
 packaging/hudi-hadoop-mr-bundle/pom.xml            |  19 +-
 packaging/hudi-hive-sync-bundle/pom.xml            |  19 +-
 packaging/hudi-integ-test-bundle/pom.xml           |  19 +-
 packaging/hudi-kafka-connect-bundle/pom.xml        |   7 +-
 packaging/hudi-presto-bundle/pom.xml               |  20 +-
 packaging/hudi-spark-bundle/pom.xml                |   4 +-
 packaging/hudi-timeline-server-bundle/pom.xml      |   7 +-
 packaging/hudi-trino-bundle/pom.xml                |  19 +-
 packaging/hudi-utilities-bundle/pom.xml            |   4 +-
 packaging/hudi-utilities-slim-bundle/pom.xml       |   4 +-
 pom.xml                                            |  69 +++++-
 scripts/release/deploy_staging_jars.sh             |   2 +-
 scripts/release/validate_staged_bundles.sh         |   1 +
 scripts/release/validate_staged_release.sh         |  20 +-
 58 files changed, 1058 insertions(+), 705 deletions(-)
 copy docker/compose/{docker-compose_hadoop284_hive233_spark244.yml => docker-compose_hadoop284_hive233_spark244_mac_aarch64.yml} (65%)
 create mode 100644 hudi-client/hudi-client-common/src/test/java/org/apache/hudi/avro/TestHoodieAvroParquetWriter.java
 create mode 100644 hudi-common/src/main/java/org/apache/hudi/avro/HoodieBloomFilterWriteSupport.java
 delete mode 100644 hudi-common/src/test/java/org/apache/hudi/avro/TestHoodieAvroWriteSupport.java
 create mode 100644 hudi-utilities/src/test/resources/schema-provider/proto/oneof_schema.avsc
 rename hudi-utilities/src/test/resources/schema-provider/proto/{sample_schema_flattened.avsc => sample_schema_defaults.avsc} (92%)
 rename hudi-utilities/src/test/resources/schema-provider/proto/{sample_schema_nested.avsc => sample_schema_wrapped_and_timestamp_as_record.avsc} (95%)