Posted to commits@spark.apache.org by gu...@apache.org on 2023/03/23 03:44:32 UTC
[spark] branch branch-3.4 updated: [SPARK-42901][CONNECT][PYTHON] Move `StorageLevel` into a separate file to avoid potential `file recursively imports`
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new 253cb7a1c53 [SPARK-42901][CONNECT][PYTHON] Move `StorageLevel` into a separate file to avoid potential `file recursively imports`
253cb7a1c53 is described below
commit 253cb7a1c5384294115f55a11b4498fe3cac82d9
Author: yangjie01 <ya...@baidu.com>
AuthorDate: Thu Mar 23 12:43:42 2023 +0900
[SPARK-42901][CONNECT][PYTHON] Move `StorageLevel` into a separate file to avoid potential `file recursively imports`
### What changes were proposed in this pull request?
https://github.com/apache/spark/pull/40510 introduced `message StorageLevel` in `base.proto`. However, if we import `base.proto` from `catalog.proto` to reuse `StorageLevel` in `message CacheTable` and run `build/sbt "connect-common/compile"`, the compiler reports the following errors in the compile log:
```
spark/connect/base.proto:23:1: File recursively imports itself: spark/connect/base.proto -> spark/connect/commands.proto -> spark/connect/relations.proto -> spark/connect/catalog.proto -> spark/connect/base.proto
spark/connect/catalog.proto:22:1: Import "spark/connect/base.proto" was not found or had errors.
spark/connect/catalog.proto:144:12: "spark.connect.DataType" seems to be defined in "spark/connect/types.proto", which is not imported by "spark/connect/catalog.proto". To use it here, please add the necessary import.
spark/connect/catalog.proto:161:12: "spark.connect.DataType" seems to be defined in "spark/connect/types.proto", which is not imported by "spark/connect/catalog.proto". To use it here, please add the necessary import.
spark/connect/relations.proto:25:1: Import "spark/connect/catalog.proto" was not found or had errors.
spark/connect/relations.proto:84:5: "Catalog" is not defined.
spark/connect/commands.proto:22:1: Import "spark/connect/relations.proto" was not found or had errors.
spark/connect/commands.proto:63:3: "Relation" is not defined.
spark/connect/commands.proto:81:3: "Relation" is not defined.
spark/connect/commands.proto:142:3: "Relation" is not defined.
spark/connect/base.proto:23:1: Import "spark/connect/commands.proto" was not found or had errors.
spark/connect/base.proto:25:1: Import "spark/connect/relations.proto" was not found or had errors.
....
```
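The `File recursively imports itself` line in the log above is a cycle in the `.proto` import graph. The cycle detection protoc performs can be sketched with a small depth-first graph walk (the file names below mirror the log; the function is illustrative, not protoc's actual implementation):

```python
def find_import_cycle(imports):
    """Return the first import cycle found as a list of files, or None.

    `imports` maps each .proto file to the list of files it imports.
    """
    def dfs(node, stack, on_stack):
        on_stack.add(node)
        stack.append(node)
        for dep in imports.get(node, []):
            if dep in on_stack:
                # Found a back edge: slice out the cycle and close the loop.
                return stack[stack.index(dep):] + [dep]
            cycle = dfs(dep, stack, on_stack)
            if cycle:
                return cycle
        on_stack.remove(node)
        stack.pop()
        return None

    for start in imports:
        cycle = dfs(start, [], set())
        if cycle:
            return cycle
    return None

# The dependency shape that triggered the error: adding the
# catalog.proto -> base.proto import closes the loop.
deps = {
    "base.proto": ["commands.proto"],
    "commands.proto": ["relations.proto"],
    "relations.proto": ["catalog.proto"],
    "catalog.proto": ["base.proto"],
}
print(" -> ".join(find_import_cycle(deps)))
# base.proto -> commands.proto -> relations.proto -> catalog.proto -> base.proto
```

Moving `StorageLevel` into a leaf file with no imports of its own (`common.proto`) breaks this loop, since any file can then import it without pulling `base.proto` back in.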
So this PR moves `message StorageLevel` into a separate file (`common.proto`) to avoid this potential recursive import.

### Why are the changes needed?
To avoid a potential recursive-import cycle between the proto files.

### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
- Pass GitHub Actions
- Manual check:
  1. Add `import "spark/connect/common.proto";` to `catalog.proto`
  2. Run `build/sbt "connect-common/compile"`
  3. Verify that no `File recursively imports itself` errors appear in the compile log.
Closes #40518 from LuciferYang/SPARK-42889-FOLLOWUP.
Lead-authored-by: yangjie01 <ya...@baidu.com>
Co-authored-by: YangJie <ya...@baidu.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 88cc2395786fd2e06f77b897288ac8a48c33c15e)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
.../src/main/protobuf/spark/connect/base.proto | 15 +-
.../src/main/protobuf/spark/connect/common.proto | 37 +++
python/pyspark/sql/connect/proto/__init__.py | 1 +
python/pyspark/sql/connect/proto/base_pb2.py | 253 ++++++++++-----------
python/pyspark/sql/connect/proto/base_pb2.pyi | 56 +----
python/pyspark/sql/connect/proto/common_pb2.py | 55 +++++
python/pyspark/sql/connect/proto/common_pb2.pyi | 93 ++++++++
7 files changed, 312 insertions(+), 198 deletions(-)
diff --git a/connector/connect/common/src/main/protobuf/spark/connect/base.proto b/connector/connect/common/src/main/protobuf/spark/connect/base.proto
index 591f32cea1b..530edb2d8c0 100644
--- a/connector/connect/common/src/main/protobuf/spark/connect/base.proto
+++ b/connector/connect/common/src/main/protobuf/spark/connect/base.proto
@@ -21,6 +21,7 @@ package spark.connect;
import "google/protobuf/any.proto";
import "spark/connect/commands.proto";
+import "spark/connect/common.proto";
import "spark/connect/expressions.proto";
import "spark/connect/relations.proto";
import "spark/connect/types.proto";
@@ -54,20 +55,6 @@ message UserContext {
repeated google.protobuf.Any extensions = 999;
}
-// StorageLevel for persisting Datasets/Tables.
-message StorageLevel {
- // (Required) Whether the cache should use disk or not.
- bool use_disk = 1;
- // (Required) Whether the cache should use memory or not.
- bool use_memory = 2;
- // (Required) Whether the cache should use off-heap or not.
- bool use_off_heap = 3;
- // (Required) Whether the cached data is deserialized or not.
- bool deserialized = 4;
- // (Required) The number of replicas.
- int32 replication = 5;
-}
-
// Request to perform plan analyze, optionally to explain the plan.
message AnalyzePlanRequest {
// (Required)
diff --git a/connector/connect/common/src/main/protobuf/spark/connect/common.proto b/connector/connect/common/src/main/protobuf/spark/connect/common.proto
new file mode 100644
index 00000000000..342588ea384
--- /dev/null
+++ b/connector/connect/common/src/main/protobuf/spark/connect/common.proto
@@ -0,0 +1,37 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+syntax = 'proto3';
+
+package spark.connect;
+
+option java_multiple_files = true;
+option java_package = "org.apache.spark.connect.proto";
+
+// StorageLevel for persisting Datasets/Tables.
+message StorageLevel {
+ // (Required) Whether the cache should use disk or not.
+ bool use_disk = 1;
+ // (Required) Whether the cache should use memory or not.
+ bool use_memory = 2;
+ // (Required) Whether the cache should use off-heap or not.
+ bool use_off_heap = 3;
+ // (Required) Whether the cached data is deserialized or not.
+ bool deserialized = 4;
+ // (Required) The number of replicas.
+ int32 replication = 5;
+}
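For reference, the five fields of the relocated message map directly onto Spark's predefined storage levels. The sketch below uses a plain-Python stand-in (hypothetical, not the generated `common_pb2.StorageLevel` class), and the constant values are my reading of `org.apache.spark.storage.StorageLevel`; double-check against the Spark source before relying on them:

```python
from typing import NamedTuple

class StorageLevel(NamedTuple):
    """Plain stand-in mirroring the five proto fields above."""
    use_disk: bool
    use_memory: bool
    use_off_heap: bool
    deserialized: bool
    replication: int

# Field values as used by Spark's built-in storage-level constants.
LEVELS = {
    "DISK_ONLY":         StorageLevel(True,  False, False, False, 1),
    "MEMORY_ONLY":       StorageLevel(False, True,  False, True,  1),
    "MEMORY_ONLY_SER":   StorageLevel(False, True,  False, False, 1),
    "MEMORY_AND_DISK":   StorageLevel(True,  True,  False, True,  1),
    "MEMORY_AND_DISK_2": StorageLevel(True,  True,  False, True,  2),
    "OFF_HEAP":          StorageLevel(True,  True,  True,  False, 1),
}
```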
diff --git a/python/pyspark/sql/connect/proto/__init__.py b/python/pyspark/sql/connect/proto/__init__.py
index a651a844e15..3e8d074d963 100644
--- a/python/pyspark/sql/connect/proto/__init__.py
+++ b/python/pyspark/sql/connect/proto/__init__.py
@@ -22,3 +22,4 @@ from pyspark.sql.connect.proto.commands_pb2 import *
from pyspark.sql.connect.proto.expressions_pb2 import *
from pyspark.sql.connect.proto.relations_pb2 import *
from pyspark.sql.connect.proto.catalog_pb2 import *
+from pyspark.sql.connect.proto.common_pb2 import *
diff --git a/python/pyspark/sql/connect/proto/base_pb2.py b/python/pyspark/sql/connect/proto/base_pb2.py
index a5222eca045..28dd46a8a2b 100644
--- a/python/pyspark/sql/connect/proto/base_pb2.py
+++ b/python/pyspark/sql/connect/proto/base_pb2.py
@@ -31,19 +31,19 @@ _sym_db = _symbol_database.Default()
from google.protobuf import any_pb2 as google_dot_protobuf_dot_any__pb2
from pyspark.sql.connect.proto import commands_pb2 as spark_dot_connect_dot_commands__pb2
+from pyspark.sql.connect.proto import common_pb2 as spark_dot_connect_dot_common__pb2
from pyspark.sql.connect.proto import expressions_pb2 as spark_dot_connect_dot_expressions__pb2
from pyspark.sql.connect.proto import relations_pb2 as spark_dot_connect_dot_relations__pb2
from pyspark.sql.connect.proto import types_pb2 as spark_dot_connect_dot_types__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
- b'\n\x18spark/connect/base.proto\x12\rspark.connect\x1a\x19google/protobuf/any.proto\x1a\x1cspark/connect/commands.proto\x1a\x1fspark/connect/expressions.proto\x1a\x1dspark/connect/relations.proto\x1a\x19spark/connect/types.proto"t\n\x04Plan\x12-\n\x04root\x18\x01 \x01(\x0b\x32\x17.spark.connect.RelationH\x00R\x04root\x12\x32\n\x07\x63ommand\x18\x02 \x01(\x0b\x32\x16.spark.connect.CommandH\x00R\x07\x63ommandB\t\n\x07op_type"z\n\x0bUserContext\x12\x17\n\x07user_id\x18\x01 \x01(\tR\x06 [...]
+ b'\n\x18spark/connect/base.proto\x12\rspark.connect\x1a\x19google/protobuf/any.proto\x1a\x1cspark/connect/commands.proto\x1a\x1aspark/connect/common.proto\x1a\x1fspark/connect/expressions.proto\x1a\x1dspark/connect/relations.proto\x1a\x19spark/connect/types.proto"t\n\x04Plan\x12-\n\x04root\x18\x01 \x01(\x0b\x32\x17.spark.connect.RelationH\x00R\x04root\x12\x32\n\x07\x63ommand\x18\x02 \x01(\x0b\x32\x16.spark.connect.CommandH\x00R\x07\x63ommandB\t\n\x07op_type"z\n\x0bUserContext\x12\x17 [...]
)
_PLAN = DESCRIPTOR.message_types_by_name["Plan"]
_USERCONTEXT = DESCRIPTOR.message_types_by_name["UserContext"]
-_STORAGELEVEL = DESCRIPTOR.message_types_by_name["StorageLevel"]
_ANALYZEPLANREQUEST = DESCRIPTOR.message_types_by_name["AnalyzePlanRequest"]
_ANALYZEPLANREQUEST_SCHEMA = _ANALYZEPLANREQUEST.nested_types_by_name["Schema"]
_ANALYZEPLANREQUEST_EXPLAIN = _ANALYZEPLANREQUEST.nested_types_by_name["Explain"]
@@ -138,17 +138,6 @@ UserContext = _reflection.GeneratedProtocolMessageType(
)
_sym_db.RegisterMessage(UserContext)
-StorageLevel = _reflection.GeneratedProtocolMessageType(
- "StorageLevel",
- (_message.Message,),
- {
- "DESCRIPTOR": _STORAGELEVEL,
- "__module__": "spark.connect.base_pb2"
- # @@protoc_insertion_point(class_scope:spark.connect.StorageLevel)
- },
-)
-_sym_db.RegisterMessage(StorageLevel)
-
AnalyzePlanRequest = _reflection.GeneratedProtocolMessageType(
"AnalyzePlanRequest",
(_message.Message,),
@@ -715,124 +704,122 @@ if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._serialized_options = b"\n\036org.apache.spark.connect.protoP\001"
_EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._options = None
_EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._serialized_options = b"8\001"
- _PLAN._serialized_start = 191
- _PLAN._serialized_end = 307
- _USERCONTEXT._serialized_start = 309
- _USERCONTEXT._serialized_end = 431
- _STORAGELEVEL._serialized_start = 434
- _STORAGELEVEL._serialized_end = 610
- _ANALYZEPLANREQUEST._serialized_start = 613
- _ANALYZEPLANREQUEST._serialized_end = 2997
- _ANALYZEPLANREQUEST_SCHEMA._serialized_start = 1808
- _ANALYZEPLANREQUEST_SCHEMA._serialized_end = 1857
- _ANALYZEPLANREQUEST_EXPLAIN._serialized_start = 1860
- _ANALYZEPLANREQUEST_EXPLAIN._serialized_end = 2175
- _ANALYZEPLANREQUEST_EXPLAIN_EXPLAINMODE._serialized_start = 2003
- _ANALYZEPLANREQUEST_EXPLAIN_EXPLAINMODE._serialized_end = 2175
- _ANALYZEPLANREQUEST_TREESTRING._serialized_start = 2177
- _ANALYZEPLANREQUEST_TREESTRING._serialized_end = 2230
- _ANALYZEPLANREQUEST_ISLOCAL._serialized_start = 2232
- _ANALYZEPLANREQUEST_ISLOCAL._serialized_end = 2282
- _ANALYZEPLANREQUEST_ISSTREAMING._serialized_start = 2284
- _ANALYZEPLANREQUEST_ISSTREAMING._serialized_end = 2338
- _ANALYZEPLANREQUEST_INPUTFILES._serialized_start = 2340
- _ANALYZEPLANREQUEST_INPUTFILES._serialized_end = 2393
- _ANALYZEPLANREQUEST_SPARKVERSION._serialized_start = 2395
- _ANALYZEPLANREQUEST_SPARKVERSION._serialized_end = 2409
- _ANALYZEPLANREQUEST_DDLPARSE._serialized_start = 2411
- _ANALYZEPLANREQUEST_DDLPARSE._serialized_end = 2452
- _ANALYZEPLANREQUEST_SAMESEMANTICS._serialized_start = 2454
- _ANALYZEPLANREQUEST_SAMESEMANTICS._serialized_end = 2575
- _ANALYZEPLANREQUEST_SEMANTICHASH._serialized_start = 2577
- _ANALYZEPLANREQUEST_SEMANTICHASH._serialized_end = 2632
- _ANALYZEPLANREQUEST_PERSIST._serialized_start = 2635
- _ANALYZEPLANREQUEST_PERSIST._serialized_end = 2786
- _ANALYZEPLANREQUEST_UNPERSIST._serialized_start = 2788
- _ANALYZEPLANREQUEST_UNPERSIST._serialized_end = 2898
- _ANALYZEPLANREQUEST_GETSTORAGELEVEL._serialized_start = 2900
- _ANALYZEPLANREQUEST_GETSTORAGELEVEL._serialized_end = 2970
- _ANALYZEPLANRESPONSE._serialized_start = 3000
- _ANALYZEPLANRESPONSE._serialized_end = 4689
- _ANALYZEPLANRESPONSE_SCHEMA._serialized_start = 4108
- _ANALYZEPLANRESPONSE_SCHEMA._serialized_end = 4165
- _ANALYZEPLANRESPONSE_EXPLAIN._serialized_start = 4167
- _ANALYZEPLANRESPONSE_EXPLAIN._serialized_end = 4215
- _ANALYZEPLANRESPONSE_TREESTRING._serialized_start = 4217
- _ANALYZEPLANRESPONSE_TREESTRING._serialized_end = 4262
- _ANALYZEPLANRESPONSE_ISLOCAL._serialized_start = 4264
- _ANALYZEPLANRESPONSE_ISLOCAL._serialized_end = 4300
- _ANALYZEPLANRESPONSE_ISSTREAMING._serialized_start = 4302
- _ANALYZEPLANRESPONSE_ISSTREAMING._serialized_end = 4350
- _ANALYZEPLANRESPONSE_INPUTFILES._serialized_start = 4352
- _ANALYZEPLANRESPONSE_INPUTFILES._serialized_end = 4386
- _ANALYZEPLANRESPONSE_SPARKVERSION._serialized_start = 4388
- _ANALYZEPLANRESPONSE_SPARKVERSION._serialized_end = 4428
- _ANALYZEPLANRESPONSE_DDLPARSE._serialized_start = 4430
- _ANALYZEPLANRESPONSE_DDLPARSE._serialized_end = 4489
- _ANALYZEPLANRESPONSE_SAMESEMANTICS._serialized_start = 4491
- _ANALYZEPLANRESPONSE_SAMESEMANTICS._serialized_end = 4530
- _ANALYZEPLANRESPONSE_SEMANTICHASH._serialized_start = 4532
- _ANALYZEPLANRESPONSE_SEMANTICHASH._serialized_end = 4570
- _ANALYZEPLANRESPONSE_PERSIST._serialized_start = 2635
- _ANALYZEPLANRESPONSE_PERSIST._serialized_end = 2644
- _ANALYZEPLANRESPONSE_UNPERSIST._serialized_start = 2788
- _ANALYZEPLANRESPONSE_UNPERSIST._serialized_end = 2799
- _ANALYZEPLANRESPONSE_GETSTORAGELEVEL._serialized_start = 4596
- _ANALYZEPLANRESPONSE_GETSTORAGELEVEL._serialized_end = 4679
- _EXECUTEPLANREQUEST._serialized_start = 4692
- _EXECUTEPLANREQUEST._serialized_end = 4901
- _EXECUTEPLANRESPONSE._serialized_start = 4904
- _EXECUTEPLANRESPONSE._serialized_end = 6179
- _EXECUTEPLANRESPONSE_SQLCOMMANDRESULT._serialized_start = 5410
- _EXECUTEPLANRESPONSE_SQLCOMMANDRESULT._serialized_end = 5481
- _EXECUTEPLANRESPONSE_ARROWBATCH._serialized_start = 5483
- _EXECUTEPLANRESPONSE_ARROWBATCH._serialized_end = 5544
- _EXECUTEPLANRESPONSE_METRICS._serialized_start = 5547
- _EXECUTEPLANRESPONSE_METRICS._serialized_end = 6064
- _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT._serialized_start = 5642
- _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT._serialized_end = 5974
- _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._serialized_start = 5851
- _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._serialized_end = 5974
- _EXECUTEPLANRESPONSE_METRICS_METRICVALUE._serialized_start = 5976
- _EXECUTEPLANRESPONSE_METRICS_METRICVALUE._serialized_end = 6064
- _EXECUTEPLANRESPONSE_OBSERVEDMETRICS._serialized_start = 6066
- _EXECUTEPLANRESPONSE_OBSERVEDMETRICS._serialized_end = 6162
- _KEYVALUE._serialized_start = 6181
- _KEYVALUE._serialized_end = 6246
- _CONFIGREQUEST._serialized_start = 6249
- _CONFIGREQUEST._serialized_end = 7277
- _CONFIGREQUEST_OPERATION._serialized_start = 6469
- _CONFIGREQUEST_OPERATION._serialized_end = 6967
- _CONFIGREQUEST_SET._serialized_start = 6969
- _CONFIGREQUEST_SET._serialized_end = 7021
- _CONFIGREQUEST_GET._serialized_start = 7023
- _CONFIGREQUEST_GET._serialized_end = 7048
- _CONFIGREQUEST_GETWITHDEFAULT._serialized_start = 7050
- _CONFIGREQUEST_GETWITHDEFAULT._serialized_end = 7113
- _CONFIGREQUEST_GETOPTION._serialized_start = 7115
- _CONFIGREQUEST_GETOPTION._serialized_end = 7146
- _CONFIGREQUEST_GETALL._serialized_start = 7148
- _CONFIGREQUEST_GETALL._serialized_end = 7196
- _CONFIGREQUEST_UNSET._serialized_start = 7198
- _CONFIGREQUEST_UNSET._serialized_end = 7225
- _CONFIGREQUEST_ISMODIFIABLE._serialized_start = 7227
- _CONFIGREQUEST_ISMODIFIABLE._serialized_end = 7261
- _CONFIGRESPONSE._serialized_start = 7279
- _CONFIGRESPONSE._serialized_end = 7401
- _ADDARTIFACTSREQUEST._serialized_start = 7404
- _ADDARTIFACTSREQUEST._serialized_end = 8275
- _ADDARTIFACTSREQUEST_ARTIFACTCHUNK._serialized_start = 7791
- _ADDARTIFACTSREQUEST_ARTIFACTCHUNK._serialized_end = 7844
- _ADDARTIFACTSREQUEST_SINGLECHUNKARTIFACT._serialized_start = 7846
- _ADDARTIFACTSREQUEST_SINGLECHUNKARTIFACT._serialized_end = 7957
- _ADDARTIFACTSREQUEST_BATCH._serialized_start = 7959
- _ADDARTIFACTSREQUEST_BATCH._serialized_end = 8052
- _ADDARTIFACTSREQUEST_BEGINCHUNKEDARTIFACT._serialized_start = 8055
- _ADDARTIFACTSREQUEST_BEGINCHUNKEDARTIFACT._serialized_end = 8248
- _ADDARTIFACTSRESPONSE._serialized_start = 8278
- _ADDARTIFACTSRESPONSE._serialized_end = 8466
- _ADDARTIFACTSRESPONSE_ARTIFACTSUMMARY._serialized_start = 8385
- _ADDARTIFACTSRESPONSE_ARTIFACTSUMMARY._serialized_end = 8466
- _SPARKCONNECTSERVICE._serialized_start = 8469
- _SPARKCONNECTSERVICE._serialized_end = 8834
+ _PLAN._serialized_start = 219
+ _PLAN._serialized_end = 335
+ _USERCONTEXT._serialized_start = 337
+ _USERCONTEXT._serialized_end = 459
+ _ANALYZEPLANREQUEST._serialized_start = 462
+ _ANALYZEPLANREQUEST._serialized_end = 2846
+ _ANALYZEPLANREQUEST_SCHEMA._serialized_start = 1657
+ _ANALYZEPLANREQUEST_SCHEMA._serialized_end = 1706
+ _ANALYZEPLANREQUEST_EXPLAIN._serialized_start = 1709
+ _ANALYZEPLANREQUEST_EXPLAIN._serialized_end = 2024
+ _ANALYZEPLANREQUEST_EXPLAIN_EXPLAINMODE._serialized_start = 1852
+ _ANALYZEPLANREQUEST_EXPLAIN_EXPLAINMODE._serialized_end = 2024
+ _ANALYZEPLANREQUEST_TREESTRING._serialized_start = 2026
+ _ANALYZEPLANREQUEST_TREESTRING._serialized_end = 2079
+ _ANALYZEPLANREQUEST_ISLOCAL._serialized_start = 2081
+ _ANALYZEPLANREQUEST_ISLOCAL._serialized_end = 2131
+ _ANALYZEPLANREQUEST_ISSTREAMING._serialized_start = 2133
+ _ANALYZEPLANREQUEST_ISSTREAMING._serialized_end = 2187
+ _ANALYZEPLANREQUEST_INPUTFILES._serialized_start = 2189
+ _ANALYZEPLANREQUEST_INPUTFILES._serialized_end = 2242
+ _ANALYZEPLANREQUEST_SPARKVERSION._serialized_start = 2244
+ _ANALYZEPLANREQUEST_SPARKVERSION._serialized_end = 2258
+ _ANALYZEPLANREQUEST_DDLPARSE._serialized_start = 2260
+ _ANALYZEPLANREQUEST_DDLPARSE._serialized_end = 2301
+ _ANALYZEPLANREQUEST_SAMESEMANTICS._serialized_start = 2303
+ _ANALYZEPLANREQUEST_SAMESEMANTICS._serialized_end = 2424
+ _ANALYZEPLANREQUEST_SEMANTICHASH._serialized_start = 2426
+ _ANALYZEPLANREQUEST_SEMANTICHASH._serialized_end = 2481
+ _ANALYZEPLANREQUEST_PERSIST._serialized_start = 2484
+ _ANALYZEPLANREQUEST_PERSIST._serialized_end = 2635
+ _ANALYZEPLANREQUEST_UNPERSIST._serialized_start = 2637
+ _ANALYZEPLANREQUEST_UNPERSIST._serialized_end = 2747
+ _ANALYZEPLANREQUEST_GETSTORAGELEVEL._serialized_start = 2749
+ _ANALYZEPLANREQUEST_GETSTORAGELEVEL._serialized_end = 2819
+ _ANALYZEPLANRESPONSE._serialized_start = 2849
+ _ANALYZEPLANRESPONSE._serialized_end = 4538
+ _ANALYZEPLANRESPONSE_SCHEMA._serialized_start = 3957
+ _ANALYZEPLANRESPONSE_SCHEMA._serialized_end = 4014
+ _ANALYZEPLANRESPONSE_EXPLAIN._serialized_start = 4016
+ _ANALYZEPLANRESPONSE_EXPLAIN._serialized_end = 4064
+ _ANALYZEPLANRESPONSE_TREESTRING._serialized_start = 4066
+ _ANALYZEPLANRESPONSE_TREESTRING._serialized_end = 4111
+ _ANALYZEPLANRESPONSE_ISLOCAL._serialized_start = 4113
+ _ANALYZEPLANRESPONSE_ISLOCAL._serialized_end = 4149
+ _ANALYZEPLANRESPONSE_ISSTREAMING._serialized_start = 4151
+ _ANALYZEPLANRESPONSE_ISSTREAMING._serialized_end = 4199
+ _ANALYZEPLANRESPONSE_INPUTFILES._serialized_start = 4201
+ _ANALYZEPLANRESPONSE_INPUTFILES._serialized_end = 4235
+ _ANALYZEPLANRESPONSE_SPARKVERSION._serialized_start = 4237
+ _ANALYZEPLANRESPONSE_SPARKVERSION._serialized_end = 4277
+ _ANALYZEPLANRESPONSE_DDLPARSE._serialized_start = 4279
+ _ANALYZEPLANRESPONSE_DDLPARSE._serialized_end = 4338
+ _ANALYZEPLANRESPONSE_SAMESEMANTICS._serialized_start = 4340
+ _ANALYZEPLANRESPONSE_SAMESEMANTICS._serialized_end = 4379
+ _ANALYZEPLANRESPONSE_SEMANTICHASH._serialized_start = 4381
+ _ANALYZEPLANRESPONSE_SEMANTICHASH._serialized_end = 4419
+ _ANALYZEPLANRESPONSE_PERSIST._serialized_start = 2484
+ _ANALYZEPLANRESPONSE_PERSIST._serialized_end = 2493
+ _ANALYZEPLANRESPONSE_UNPERSIST._serialized_start = 2637
+ _ANALYZEPLANRESPONSE_UNPERSIST._serialized_end = 2648
+ _ANALYZEPLANRESPONSE_GETSTORAGELEVEL._serialized_start = 4445
+ _ANALYZEPLANRESPONSE_GETSTORAGELEVEL._serialized_end = 4528
+ _EXECUTEPLANREQUEST._serialized_start = 4541
+ _EXECUTEPLANREQUEST._serialized_end = 4750
+ _EXECUTEPLANRESPONSE._serialized_start = 4753
+ _EXECUTEPLANRESPONSE._serialized_end = 6028
+ _EXECUTEPLANRESPONSE_SQLCOMMANDRESULT._serialized_start = 5259
+ _EXECUTEPLANRESPONSE_SQLCOMMANDRESULT._serialized_end = 5330
+ _EXECUTEPLANRESPONSE_ARROWBATCH._serialized_start = 5332
+ _EXECUTEPLANRESPONSE_ARROWBATCH._serialized_end = 5393
+ _EXECUTEPLANRESPONSE_METRICS._serialized_start = 5396
+ _EXECUTEPLANRESPONSE_METRICS._serialized_end = 5913
+ _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT._serialized_start = 5491
+ _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT._serialized_end = 5823
+ _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._serialized_start = 5700
+ _EXECUTEPLANRESPONSE_METRICS_METRICOBJECT_EXECUTIONMETRICSENTRY._serialized_end = 5823
+ _EXECUTEPLANRESPONSE_METRICS_METRICVALUE._serialized_start = 5825
+ _EXECUTEPLANRESPONSE_METRICS_METRICVALUE._serialized_end = 5913
+ _EXECUTEPLANRESPONSE_OBSERVEDMETRICS._serialized_start = 5915
+ _EXECUTEPLANRESPONSE_OBSERVEDMETRICS._serialized_end = 6011
+ _KEYVALUE._serialized_start = 6030
+ _KEYVALUE._serialized_end = 6095
+ _CONFIGREQUEST._serialized_start = 6098
+ _CONFIGREQUEST._serialized_end = 7126
+ _CONFIGREQUEST_OPERATION._serialized_start = 6318
+ _CONFIGREQUEST_OPERATION._serialized_end = 6816
+ _CONFIGREQUEST_SET._serialized_start = 6818
+ _CONFIGREQUEST_SET._serialized_end = 6870
+ _CONFIGREQUEST_GET._serialized_start = 6872
+ _CONFIGREQUEST_GET._serialized_end = 6897
+ _CONFIGREQUEST_GETWITHDEFAULT._serialized_start = 6899
+ _CONFIGREQUEST_GETWITHDEFAULT._serialized_end = 6962
+ _CONFIGREQUEST_GETOPTION._serialized_start = 6964
+ _CONFIGREQUEST_GETOPTION._serialized_end = 6995
+ _CONFIGREQUEST_GETALL._serialized_start = 6997
+ _CONFIGREQUEST_GETALL._serialized_end = 7045
+ _CONFIGREQUEST_UNSET._serialized_start = 7047
+ _CONFIGREQUEST_UNSET._serialized_end = 7074
+ _CONFIGREQUEST_ISMODIFIABLE._serialized_start = 7076
+ _CONFIGREQUEST_ISMODIFIABLE._serialized_end = 7110
+ _CONFIGRESPONSE._serialized_start = 7128
+ _CONFIGRESPONSE._serialized_end = 7250
+ _ADDARTIFACTSREQUEST._serialized_start = 7253
+ _ADDARTIFACTSREQUEST._serialized_end = 8124
+ _ADDARTIFACTSREQUEST_ARTIFACTCHUNK._serialized_start = 7640
+ _ADDARTIFACTSREQUEST_ARTIFACTCHUNK._serialized_end = 7693
+ _ADDARTIFACTSREQUEST_SINGLECHUNKARTIFACT._serialized_start = 7695
+ _ADDARTIFACTSREQUEST_SINGLECHUNKARTIFACT._serialized_end = 7806
+ _ADDARTIFACTSREQUEST_BATCH._serialized_start = 7808
+ _ADDARTIFACTSREQUEST_BATCH._serialized_end = 7901
+ _ADDARTIFACTSREQUEST_BEGINCHUNKEDARTIFACT._serialized_start = 7904
+ _ADDARTIFACTSREQUEST_BEGINCHUNKEDARTIFACT._serialized_end = 8097
+ _ADDARTIFACTSRESPONSE._serialized_start = 8127
+ _ADDARTIFACTSRESPONSE._serialized_end = 8315
+ _ADDARTIFACTSRESPONSE_ARTIFACTSUMMARY._serialized_start = 8234
+ _ADDARTIFACTSRESPONSE_ARTIFACTSUMMARY._serialized_end = 8315
+ _SPARKCONNECTSERVICE._serialized_start = 8318
+ _SPARKCONNECTSERVICE._serialized_end = 8683
# @@protoc_insertion_point(module_scope)
diff --git a/python/pyspark/sql/connect/proto/base_pb2.pyi b/python/pyspark/sql/connect/proto/base_pb2.pyi
index 8c1f9f09d61..1a8661cdd44 100644
--- a/python/pyspark/sql/connect/proto/base_pb2.pyi
+++ b/python/pyspark/sql/connect/proto/base_pb2.pyi
@@ -41,6 +41,7 @@ import google.protobuf.internal.containers
import google.protobuf.internal.enum_type_wrapper
import google.protobuf.message
import pyspark.sql.connect.proto.commands_pb2
+import pyspark.sql.connect.proto.common_pb2
import pyspark.sql.connect.proto.expressions_pb2
import pyspark.sql.connect.proto.relations_pb2
import pyspark.sql.connect.proto.types_pb2
@@ -132,53 +133,6 @@ class UserContext(google.protobuf.message.Message):
global___UserContext = UserContext
-class StorageLevel(google.protobuf.message.Message):
- """StorageLevel for persisting Datasets/Tables."""
-
- DESCRIPTOR: google.protobuf.descriptor.Descriptor
-
- USE_DISK_FIELD_NUMBER: builtins.int
- USE_MEMORY_FIELD_NUMBER: builtins.int
- USE_OFF_HEAP_FIELD_NUMBER: builtins.int
- DESERIALIZED_FIELD_NUMBER: builtins.int
- REPLICATION_FIELD_NUMBER: builtins.int
- use_disk: builtins.bool
- """(Required) Whether the cache should use disk or not."""
- use_memory: builtins.bool
- """(Required) Whether the cache should use memory or not."""
- use_off_heap: builtins.bool
- """(Required) Whether the cache should use off-heap or not."""
- deserialized: builtins.bool
- """(Required) Whether the cached data is deserialized or not."""
- replication: builtins.int
- """(Required) The number of replicas."""
- def __init__(
- self,
- *,
- use_disk: builtins.bool = ...,
- use_memory: builtins.bool = ...,
- use_off_heap: builtins.bool = ...,
- deserialized: builtins.bool = ...,
- replication: builtins.int = ...,
- ) -> None: ...
- def ClearField(
- self,
- field_name: typing_extensions.Literal[
- "deserialized",
- b"deserialized",
- "replication",
- b"replication",
- "use_disk",
- b"use_disk",
- "use_memory",
- b"use_memory",
- "use_off_heap",
- b"use_off_heap",
- ],
- ) -> None: ...
-
-global___StorageLevel = StorageLevel
-
class AnalyzePlanRequest(google.protobuf.message.Message):
"""Request to perform plan analyze, optionally to explain the plan."""
@@ -423,13 +377,13 @@ class AnalyzePlanRequest(google.protobuf.message.Message):
def relation(self) -> pyspark.sql.connect.proto.relations_pb2.Relation:
"""(Required) The logical plan to persist."""
@property
- def storage_level(self) -> global___StorageLevel:
+ def storage_level(self) -> pyspark.sql.connect.proto.common_pb2.StorageLevel:
"""(Optional) The storage level."""
def __init__(
self,
*,
relation: pyspark.sql.connect.proto.relations_pb2.Relation | None = ...,
- storage_level: global___StorageLevel | None = ...,
+ storage_level: pyspark.sql.connect.proto.common_pb2.StorageLevel | None = ...,
) -> None: ...
def HasField(
self,
@@ -866,12 +820,12 @@ class AnalyzePlanResponse(google.protobuf.message.Message):
STORAGE_LEVEL_FIELD_NUMBER: builtins.int
@property
- def storage_level(self) -> global___StorageLevel:
+ def storage_level(self) -> pyspark.sql.connect.proto.common_pb2.StorageLevel:
"""(Required) The StorageLevel as a result of get_storage_level request."""
def __init__(
self,
*,
- storage_level: global___StorageLevel | None = ...,
+ storage_level: pyspark.sql.connect.proto.common_pb2.StorageLevel | None = ...,
) -> None: ...
def HasField(
self, field_name: typing_extensions.Literal["storage_level", b"storage_level"]
diff --git a/python/pyspark/sql/connect/proto/common_pb2.py b/python/pyspark/sql/connect/proto/common_pb2.py
new file mode 100644
index 00000000000..b65b54cc0c9
--- /dev/null
+++ b/python/pyspark/sql/connect/proto/common_pb2.py
@@ -0,0 +1,55 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# source: spark/connect/common.proto
+"""Generated protocol buffer code."""
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import message as _message
+from google.protobuf import reflection as _reflection
+from google.protobuf import symbol_database as _symbol_database
+
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
+ b'\n\x1aspark/connect/common.proto\x12\rspark.connect"\xb0\x01\n\x0cStorageLevel\x12\x19\n\x08use_disk\x18\x01 \x01(\x08R\x07useDisk\x12\x1d\n\nuse_memory\x18\x02 \x01(\x08R\tuseMemory\x12 \n\x0cuse_off_heap\x18\x03 \x01(\x08R\nuseOffHeap\x12"\n\x0c\x64\x65serialized\x18\x04 \x01(\x08R\x0c\x64\x65serialized\x12 \n\x0breplication\x18\x05 \x01(\x05R\x0breplicationB"\n\x1eorg.apache.spark.connect.protoP\x01\x62\x06proto3'
+)
+
+
+_STORAGELEVEL = DESCRIPTOR.message_types_by_name["StorageLevel"]
+StorageLevel = _reflection.GeneratedProtocolMessageType(
+ "StorageLevel",
+ (_message.Message,),
+ {
+ "DESCRIPTOR": _STORAGELEVEL,
+ "__module__": "spark.connect.common_pb2"
+ # @@protoc_insertion_point(class_scope:spark.connect.StorageLevel)
+ },
+)
+_sym_db.RegisterMessage(StorageLevel)
+
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+ DESCRIPTOR._options = None
+ DESCRIPTOR._serialized_options = b"\n\036org.apache.spark.connect.protoP\001"
+ _STORAGELEVEL._serialized_start = 46
+ _STORAGELEVEL._serialized_end = 222
+# @@protoc_insertion_point(module_scope)
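The serialized blob in `AddSerializedFile` above encodes the schema, not message data; on the wire, all five `StorageLevel` fields are proto3 varints (wire type 0), and fields at their default value (false/0) are omitted. A hand-rolled encoder for illustration only — real code should use the generated class's `SerializeToString()`:

```python
def encode_varint(n):
    """Encode a non-negative int as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def encode_storage_level(use_disk=False, use_memory=False, use_off_heap=False,
                         deserialized=False, replication=0):
    """Encode the five StorageLevel fields per the proto3 wire format."""
    out = b""
    for field_no, value in enumerate(
            (use_disk, use_memory, use_off_heap, deserialized, replication),
            start=1):
        if value:  # proto3 omits default (zero/false) scalar fields
            # tag = (field_number << 3) | wire_type, wire type 0 = varint
            out += encode_varint(field_no << 3) + encode_varint(int(value))
    return out

# use_disk, use_memory, deserialized, replication=1 (MEMORY_AND_DISK-shaped):
# tags 0x08, 0x10, 0x20, 0x28, each followed by varint 1.
encoded = encode_storage_level(use_disk=True, use_memory=True,
                               deserialized=True, replication=1)
print(encoded.hex())  # 0801100120012801
```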
diff --git a/python/pyspark/sql/connect/proto/common_pb2.pyi b/python/pyspark/sql/connect/proto/common_pb2.pyi
new file mode 100644
index 00000000000..2a8fef7c766
--- /dev/null
+++ b/python/pyspark/sql/connect/proto/common_pb2.pyi
@@ -0,0 +1,93 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""
+@generated by mypy-protobuf. Do not edit manually!
+isort:skip_file
+
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements. See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import builtins
+import google.protobuf.descriptor
+import google.protobuf.message
+import sys
+
+if sys.version_info >= (3, 8):
+ import typing as typing_extensions
+else:
+ import typing_extensions
+
+DESCRIPTOR: google.protobuf.descriptor.FileDescriptor
+
+class StorageLevel(google.protobuf.message.Message):
+ """StorageLevel for persisting Datasets/Tables."""
+
+ DESCRIPTOR: google.protobuf.descriptor.Descriptor
+
+ USE_DISK_FIELD_NUMBER: builtins.int
+ USE_MEMORY_FIELD_NUMBER: builtins.int
+ USE_OFF_HEAP_FIELD_NUMBER: builtins.int
+ DESERIALIZED_FIELD_NUMBER: builtins.int
+ REPLICATION_FIELD_NUMBER: builtins.int
+ use_disk: builtins.bool
+ """(Required) Whether the cache should use disk or not."""
+ use_memory: builtins.bool
+ """(Required) Whether the cache should use memory or not."""
+ use_off_heap: builtins.bool
+ """(Required) Whether the cache should use off-heap or not."""
+ deserialized: builtins.bool
+ """(Required) Whether the cached data is deserialized or not."""
+ replication: builtins.int
+ """(Required) The number of replicas."""
+ def __init__(
+ self,
+ *,
+ use_disk: builtins.bool = ...,
+ use_memory: builtins.bool = ...,
+ use_off_heap: builtins.bool = ...,
+ deserialized: builtins.bool = ...,
+ replication: builtins.int = ...,
+ ) -> None: ...
+ def ClearField(
+ self,
+ field_name: typing_extensions.Literal[
+ "deserialized",
+ b"deserialized",
+ "replication",
+ b"replication",
+ "use_disk",
+ b"use_disk",
+ "use_memory",
+ b"use_memory",
+ "use_off_heap",
+ b"use_off_heap",
+ ],
+ ) -> None: ...
+
+global___StorageLevel = StorageLevel