Posted to reviews@spark.apache.org by "ueshin (via GitHub)" <gi...@apache.org> on 2023/05/26 22:33:27 UTC

[GitHub] [spark] ueshin commented on a diff in pull request #41316: [SPARK-43798][SQL][PYTHON] Support Python user-defined table functions

ueshin commented on code in PR #41316:
URL: https://github.com/apache/spark/pull/41316#discussion_r1207428719


##########
python/pyspark/worker.py:
##########
@@ -456,6 +456,54 @@ def assign_cols_by_name(runner_conf):
     )
 
 
+def read_udtf(pickleSer, infile, eval_type):
+    num_udtfs = read_int(infile)
+    if num_udtfs != 1:
+        raise RuntimeError("Got more than 1 UDTF")
+
+    # See `PythonUDFRunner.writeUDFs`.
+    num_arg = read_int(infile)
+    arg_offsets = [read_int(infile) for _ in range(num_arg)]
+    num_chained_funcs = read_int(infile)
+    if num_chained_funcs != 1:
+        raise RuntimeError("Got more than 1 chained UDTF")
+
+    handler, return_type = read_command(pickleSer, infile)
+    if not isinstance(handler, type):
+        raise RuntimeError(f"UDTF handler must be a class, but got {type(handler)}.")
+
+    # Instantiate the UDTF class.
+    try:
+        udtf = handler()
+    except Exception as e:
+        raise RuntimeError(f"Failed to init the UDTF handler: {str(e)}") from None
+
+    # Wrap the eval method.
+    if not hasattr(udtf, "eval"):
+        raise RuntimeError("Python UDTF must implement the eval method.")
+
+    def wrap_udtf(f, return_type):
+        if return_type.needConversion():
+            toInternal = return_type.toInternal
+            return lambda *a: map(toInternal, f(*a))
+        else:
+            return lambda *a: f(*a)
+
+    f = wrap_udtf(getattr(udtf, "eval"), return_type)
+
+    def mapper(a):
+        results = tuple(f(*[a[o] for o in arg_offsets]))
+        return results
+
+    # Return an iterator of iterators.
+    def func(_, it):
+        return map(mapper, it)

Review Comment:
   I guess we can call `terminate` here; then we don't need to return `udtf`?
   
   ```py
   def func(_, it):
       try:
           yield from map(mapper, it)
       finally:
           if hasattr(udtf, "terminate"):
               udtf.terminate()
   ```
   
   try-finally is necessary if `terminate()` should always be called; otherwise we can omit it.
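   
   For illustration only, a minimal standalone sketch (hypothetical `FakeUDTF` and `rows`, not part of this PR) of why the try-finally matters: with the yield wrapped in try-finally, `terminate()` also runs when the consumer closes the generator early; without it, `terminate()` would only run once the iterator is fully consumed.
   
   ```py
   # Hypothetical stand-ins for the real handler instance and input iterator.
   class FakeUDTF:
       def eval(self, x):
           yield (x,)
   
       def terminate(self):
           print("terminate called")
   
   udtf = FakeUDTF()
   rows = iter([(1,), (2,), (3,)])
   
   def func(_, it):
       try:
           for row in it:
               yield tuple(udtf.eval(*row))
       finally:
           if hasattr(udtf, "terminate"):
               udtf.terminate()
   
   gen = func(0, rows)
   next(gen)    # consume only the first result
   gen.close()  # closing the generator still runs the finally block,
                # so terminate() is called even on early termination
   ```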



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

