Posted to commits@spark.apache.org by gu...@apache.org on 2023/02/21 23:40:17 UTC
[spark] branch branch-3.4 updated: [SPARK-42002][CONNECT][FOLLOW-UP] Add Required/Optional notions to writer v2 proto
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new 9a7881a0192 [SPARK-42002][CONNECT][FOLLOW-UP] Add Required/Optional notions to writer v2 proto
9a7881a0192 is described below
commit 9a7881a019225564439adfdcf556b0fab9928709
Author: Rui Wang <ru...@databricks.com>
AuthorDate: Wed Feb 22 08:39:51 2023 +0900
[SPARK-42002][CONNECT][FOLLOW-UP] Add Required/Optional notions to writer v2 proto
### What changes were proposed in this pull request?
Following the existing proto style guide, we should always add a `(Required)`/`(Optional)` notion to the proto documentation.
### Why are the changes needed?
Improve documentation.
### Does this PR introduce _any_ user-facing change?
NO
### How was this patch tested?
N/A
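For context, the convention this follow-up applies prefixes every field comment with `(Required)` or `(Optional)`. A minimal sketch (using a hypothetical message, not one from the actual commands.proto):

```protobuf
syntax = "proto3";

package spark.connect.example;

// A hypothetical message illustrating the documentation convention:
// every field comment leads with (Required) or (Optional).
message ExampleWrite {
  // (Required) The destination table for the write.
  string table_name = 1;

  // (Optional) The data source provider, e.g. "parquet" or "json".
  string provider = 2;
}
```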
Closes #40106 from amaliujia/rw-fix-proto.
Authored-by: Rui Wang <ru...@databricks.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 5097c669ffae23997db00b8f2eec89abb4f33cfc)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
.../connect/common/src/main/protobuf/spark/connect/commands.proto | 5 +++--
python/pyspark/sql/connect/proto/commands_pb2.pyi | 5 +++--
2 files changed, 6 insertions(+), 4 deletions(-)
diff --git a/connector/connect/common/src/main/protobuf/spark/connect/commands.proto b/connector/connect/common/src/main/protobuf/spark/connect/commands.proto
index 88d7e81beec..7567b0e3d7c 100644
--- a/connector/connect/common/src/main/protobuf/spark/connect/commands.proto
+++ b/connector/connect/common/src/main/protobuf/spark/connect/commands.proto
@@ -123,10 +123,10 @@ message WriteOperationV2 {
// (Required) The output of the `input` relation will be persisted according to the options.
Relation input = 1;
- // The destination of the write operation must be either a path or a table.
+ // (Required) The destination of the write operation must be either a path or a table.
string table_name = 2;
- // A provider for the underlying output data source. Spark's default catalog supports
+ // (Optional) A provider for the underlying output data source. Spark's default catalog supports
// "parquet", "json", etc.
string provider = 3;
@@ -140,6 +140,7 @@ message WriteOperationV2 {
// (Optional) A list of table properties.
map<string, string> table_properties = 6;
+ // (Required) Write mode.
Mode mode = 7;
enum Mode {
diff --git a/python/pyspark/sql/connect/proto/commands_pb2.pyi b/python/pyspark/sql/connect/proto/commands_pb2.pyi
index 46d1921efc2..c102624ca44 100644
--- a/python/pyspark/sql/connect/proto/commands_pb2.pyi
+++ b/python/pyspark/sql/connect/proto/commands_pb2.pyi
@@ -474,9 +474,9 @@ class WriteOperationV2(google.protobuf.message.Message):
def input(self) -> pyspark.sql.connect.proto.relations_pb2.Relation:
"""(Required) The output of the `input` relation will be persisted according to the options."""
table_name: builtins.str
- """The destination of the write operation must be either a path or a table."""
+ """(Required) The destination of the write operation must be either a path or a table."""
provider: builtins.str
- """A provider for the underlying output data source. Spark's default catalog supports
+ """(Optional) A provider for the underlying output data source. Spark's default catalog supports
"parquet", "json", etc.
"""
@property
@@ -497,6 +497,7 @@ class WriteOperationV2(google.protobuf.message.Message):
) -> google.protobuf.internal.containers.ScalarMap[builtins.str, builtins.str]:
"""(Optional) A list of table properties."""
mode: global___WriteOperationV2.Mode.ValueType
+ """(Required) Write mode."""
@property
def overwrite_condition(self) -> pyspark.sql.connect.proto.expressions_pb2.Expression:
"""(Optional) A condition for overwrite saving mode"""