Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2020/03/30 19:58:47 UTC

[GitHub] [beam] robertwb opened a new pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264
 
 
   This may be a more natural interface for mixing pandas-style code with Beam.
       
   It does have the downside of being inefficient if dataframes are converted back to PCollections one at a time.
   
   Builds on https://github.com/apache/beam/pull/10760
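   
   Concretely, the intended flow looks something like the following (illustrative pseudocode only; the source transform and the `proxy` value here are placeholders, not APIs added by this change):
   
   ```
   with beam.Pipeline() as p:
     pcoll = p | SomeSchemaSource(...)     # placeholder schema'd source
     df = to_dataframe(pcoll, proxy)       # PCollection -> deferred dataframe
     totals = df.groupby('key').sum()      # ordinary pandas-style code, deferred
     result = to_pcollection(totals)       # deferred dataframe -> PCollection
   ```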
   
   ------------------------
   
   Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
   
    - [ ] [**Choose reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and mention them in a comment (`R: @username`).
    - [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
    - [ ] Update `CHANGES.md` with noteworthy changes.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more tips on [how to make review process smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   Lang | SDK | Apex | Dataflow | Flink | Gearpump | Samza | Spark
   --- | --- | --- | --- | --- | --- | --- | ---
   Go | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/)
   Java | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/)
   Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
   XLang | --- | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/)
   
   Pre-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   --- | Java | Python | Go | Website
   --- | --- | --- | --- | ---
   Non-portable | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/) 
   Portable | --- | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/) | --- | ---
   
   See [.test-infra/jenkins/README](https://github.com/apache/beam/blob/master/.test-infra/jenkins/README.md) for trigger phrase, status and link of all Jenkins jobs.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [beam] robertwb commented on issue #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

Posted by GitBox <gi...@apache.org>.
URL: https://github.com/apache/beam/pull/11264#issuecomment-613592980
 
 
   Run Python PreCommit


[GitHub] [beam] TheNeuralBit commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r404210149
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/convert.py
 ##########
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import inspect
+
+from apache_beam import pvalue
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import transforms
+
+
+def to_dataframe(pc):
+  pass
+
+
+# TODO: Or should this be called as_dataframe?
 
 Review comment:
   I like `as_dataframe` if it were a method on `PCollection` since it's fluent - `df = pcol.as_dataframe()`
   
   Similarly, below `from_dataframe` would be fluent as a static method on `PCollection`, but `PCollection.from_dataframe` feels too verbose.
   
   Would you agree we want something like the following to be as easy and intuitive as possible?
   ```py
   pcol = p | "Read from Source" >> beam.io.SomeSchemaSource(foo)
   
   df = pcol.as_dataframe()
   df_agg = df[df["measurement"] > threshold].groupby("id").count()
   
   PCollection.from_dataframe(df_agg) | "Write to Sink" >> beam.io.SomeSink(bar)
   ```
   
   
   


[GitHub] [beam] TheNeuralBit commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r404212382
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/transforms.py
 ##########
 @@ -0,0 +1,255 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import pandas as pd
+
+import apache_beam as beam
+from apache_beam import transforms
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import frames  # pylint: disable=unused-import
+
+
+class DataframeTransform(transforms.PTransform):
+  """A PTransform for applying function that takes and returns dataframes
+  to one or more PCollections.
+
+  For example, if pcoll is a PCollection of dataframes, one could write::
+
+      pcoll | DataframeTransform(lambda df: df.group_by('key').sum(), proxy=...)
+
+  To pass multiple PCollections, pass a tuple of PCollections wich will be
+  passed to the callable as positional arguments, or a dictionary of
+  PCollections, in which case they will be passed as keyword arguments.
+  """
+  def __init__(self, func, proxy):
+    self._func = func
+    self._proxy = proxy
+
+  def expand(self, input_pcolls):
+    def wrap_as_dict(values):
+      if isinstance(values, dict):
+        return values
+      elif isinstance(values, tuple):
+        return dict(enumerate(values))
+      else:
+        return {None: values}
+
+    # TODO: Infer the proxy from the input schema.
+    def proxy(key):
+      if key is None:
+        return self._proxy
+      else:
+        return self._proxy[key]
+
+    # The input can be a dictionary, tuple, or plain PCollection.
+    # Wrap as a dict for homogeneity.
+    # TODO: Possibly inject batching here.
+    input_dict = wrap_as_dict(input_pcolls)
+    placeholders = {
+        key: frame_base.DeferredFrame.wrap(
+            expressions.PlaceholderExpression(proxy(key)))
+        for key in input_dict.keys()
+    }
+
+    # The calling convention of the user-supplied func varies according to the
+    # type of the input.
+    if isinstance(input_pcolls, dict):
+      result_frames = self._func(**placeholders)
+    elif isinstance(input_pcolls, tuple):
+      result_frames = self._func(
+          *(value for _, value in sorted(placeholders.items())))
+    else:
+      result_frames = self._func(placeholders[None])
+
+    # Likewise the output may be a dict, tuple, or raw (deferred) Dataframe.
+    result_dict = wrap_as_dict(result_frames)
+
+    result_pcolls = {
+        placeholders[key]._expr: pcoll
+        for key, pcoll in input_dict.items()
+    } | 'Eval' >> DataframeExpressionsTransform(
+        {key: df._expr
+         for key, df in result_dict.items()})
+
+    # Convert the result back into a set of PCollections.
+    if isinstance(result_frames, dict):
+      return result_pcolls
+    elif isinstance(result_frames, tuple):
+      return tuple((value for _, value in sorted(result_pcolls.items())))
+    else:
+      return result_pcolls[None]
 
 Review comment:
   Could this be streamlined with calls to `to_dataframe` and `to_pcollection`?
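
For reference, the dict/tuple/plain normalization that `expand` performs on the way in (and undoes on the way out) can be exercised standalone. This is just the hunk's `wrap_as_dict` logic lifted out for illustration, with the inverse written as a hypothetical `unwrap_like` helper that is not part of the PR:

```python
def wrap_as_dict(values):
  """Normalizes a dict, tuple, or single value into a dict so downstream
  code can treat all three calling conventions uniformly."""
  if isinstance(values, dict):
    return values
  elif isinstance(values, tuple):
    # Positional inputs keep their order via integer keys.
    return dict(enumerate(values))
  else:
    # A lone value is keyed by None.
    return {None: values}


def unwrap_like(template, result_dict):
  """Hypothetical inverse of wrap_as_dict: shapes result_dict like the
  original input (dict stays a dict, tuple keys are re-sorted, a lone
  value is unwrapped from its None key)."""
  if isinstance(template, dict):
    return result_dict
  elif isinstance(template, tuple):
    return tuple(value for _, value in sorted(result_dict.items()))
  else:
    return result_dict[None]
```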


[GitHub] [beam] robertwb commented on issue #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#issuecomment-613593039
 
 
   @TheNeuralBit PTAL


[GitHub] [beam] TheNeuralBit commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r404200439
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/transforms.py
 ##########
 @@ -0,0 +1,255 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import pandas as pd
+
+import apache_beam as beam
+from apache_beam import transforms
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import frames  # pylint: disable=unused-import
+
+
+class DataframeTransform(transforms.PTransform):
+  """A PTransform for applying function that takes and returns dataframes
+  to one or more PCollections.
+
+  For example, if pcoll is a PCollection of dataframes, one could write::
+
+      pcoll | DataframeTransform(lambda df: df.group_by('key').sum(), proxy=...)
+
+  To pass multiple PCollections, pass a tuple of PCollections wich will be
+  passed to the callable as positional arguments, or a dictionary of
+  PCollections, in which case they will be passed as keyword arguments.
+  """
+  def __init__(self, func, proxy):
+    self._func = func
+    self._proxy = proxy
+
+  def expand(self, input_pcolls):
+    def wrap_as_dict(values):
+      if isinstance(values, dict):
+        return values
+      elif isinstance(values, tuple):
+        return dict(enumerate(values))
+      else:
+        return {None: values}
+
+    # TODO: Infer the proxy from the input schema.
+    def proxy(key):
+      if key is None:
+        return self._proxy
+      else:
+        return self._proxy[key]
+
+    # The input can be a dictionary, tuple, or plain PCollection.
+    # Wrap as a dict for homogeneity.
+    # TODO: Possibly inject batching here.
+    input_dict = wrap_as_dict(input_pcolls)
+    placeholders = {
+        key: frame_base.DeferredFrame.wrap(
+            expressions.PlaceholderExpression(proxy(key)))
+        for key in input_dict.keys()
+    }
+
+    # The calling convention of the user-supplied func varies according to the
+    # type of the input.
+    if isinstance(input_pcolls, dict):
+      result_frames = self._func(**placeholders)
+    elif isinstance(input_pcolls, tuple):
+      result_frames = self._func(
+          *(value for _, value in sorted(placeholders.items())))
+    else:
+      result_frames = self._func(placeholders[None])
+
+    # Likewise the output may be a dict, tuple, or raw (deferred) Dataframe.
+    result_dict = wrap_as_dict(result_frames)
+
+    result_pcolls = {
+        placeholders[key]._expr: pcoll
+        for key, pcoll in input_dict.items()
+    } | 'Eval' >> DataframeExpressionsTransform(
+        {key: df._expr
+         for key, df in result_dict.items()})
+
+    # Convert the result back into a set of PCollections.
+    if isinstance(result_frames, dict):
+      return result_pcolls
+    elif isinstance(result_frames, tuple):
+      return tuple((value for _, value in sorted(result_pcolls.items())))
+    else:
+      return result_pcolls[None]
+
+
+class DataframeExpressionsTransform(transforms.PTransform):
 
 Review comment:
   This looks like something that should be "package-private". Should it get an underscore prefix?


[GitHub] [beam] robertwb commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r407745027
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/transforms.py
 ##########
 @@ -0,0 +1,255 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import pandas as pd
+
+import apache_beam as beam
+from apache_beam import transforms
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import frames  # pylint: disable=unused-import
+
+
+class DataframeTransform(transforms.PTransform):
+  """A PTransform for applying function that takes and returns dataframes
+  to one or more PCollections.
+
+  For example, if pcoll is a PCollection of dataframes, one could write::
+
+      pcoll | DataframeTransform(lambda df: df.group_by('key').sum(), proxy=...)
+
+  To pass multiple PCollections, pass a tuple of PCollections wich will be
+  passed to the callable as positional arguments, or a dictionary of
+  PCollections, in which case they will be passed as keyword arguments.
+  """
+  def __init__(self, func, proxy):
+    self._func = func
+    self._proxy = proxy
+
+  def expand(self, input_pcolls):
+    def wrap_as_dict(values):
+      if isinstance(values, dict):
+        return values
+      elif isinstance(values, tuple):
+        return dict(enumerate(values))
+      else:
+        return {None: values}
+
+    # TODO: Infer the proxy from the input schema.
+    def proxy(key):
+      if key is None:
+        return self._proxy
+      else:
+        return self._proxy[key]
+
+    # The input can be a dictionary, tuple, or plain PCollection.
+    # Wrap as a dict for homogeneity.
+    # TODO: Possibly inject batching here.
+    input_dict = wrap_as_dict(input_pcolls)
+    placeholders = {
+        key: frame_base.DeferredFrame.wrap(
+            expressions.PlaceholderExpression(proxy(key)))
+        for key in input_dict.keys()
+    }
+
+    # The calling convention of the user-supplied func varies according to the
+    # type of the input.
+    if isinstance(input_pcolls, dict):
+      result_frames = self._func(**placeholders)
+    elif isinstance(input_pcolls, tuple):
+      result_frames = self._func(
+          *(value for _, value in sorted(placeholders.items())))
+    else:
+      result_frames = self._func(placeholders[None])
+
+    # Likewise the output may be a dict, tuple, or raw (deferred) Dataframe.
+    result_dict = wrap_as_dict(result_frames)
+
+    result_pcolls = {
+        placeholders[key]._expr: pcoll
+        for key, pcoll in input_dict.items()
+    } | 'Eval' >> DataframeExpressionsTransform(
+        {key: df._expr
+         for key, df in result_dict.items()})
+
+    # Convert the result back into a set of PCollections.
+    if isinstance(result_frames, dict):
+      return result_pcolls
+    elif isinstance(result_frames, tuple):
+      return tuple((value for _, value in sorted(result_pcolls.items())))
+    else:
+      return result_pcolls[None]
 
 Review comment:
   Yes. Done.


[GitHub] [beam] TheNeuralBit commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r404210590
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/convert.py
 ##########
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import inspect
+
+from apache_beam import pvalue
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import transforms
+
+
+def to_dataframe(pc):
+  pass
+
+
+# TODO: Or should this be called as_dataframe?
+def to_dataframe(pcoll, proxy):
+  return frame_base.DeferredFrame.wrap(
+      expressions.PlaceholderExpression(proxy, pcoll))
+
+
+# TODO: Or should this be called from_dataframe?
+def to_pcollection(*dataframes, **kwargs):
 
 Review comment:
   Could you add typehints and docstrings here and on to_dataframe?
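
For what it's worth, the requested hints and docstring might look roughly like this (a sketch only: the comment-style annotation follows the Python-2-compatible convention, and the description of `proxy` is inferred from how the hunk uses it, not taken from the PR):

```python
def to_dataframe(pcoll, proxy):
  # type: (pvalue.PCollection, pd.DataFrame) -> frame_base.DeferredFrame
  """Converts a PCollection into a deferred dataframe.

  Args:
    pcoll: a PCollection whose elements become the rows of the result.
    proxy: an empty pandas object of the expected type, used to determine
      the shape (columns and dtypes) of the deferred result.
  """
```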


[GitHub] [beam] TheNeuralBit commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r404195774
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/convert.py
 ##########
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import inspect
+
+from apache_beam import pvalue
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import transforms
+
+
+def to_dataframe(pc):
+  pass
+
+
+# TODO: Or should this be called as_dataframe?
+def to_dataframe(pcoll, proxy):
+  return frame_base.DeferredFrame.wrap(
+      expressions.PlaceholderExpression(proxy, pcoll))
+
+
+# TODO: Or should this be called from_dataframe?
+def to_pcollection(*dataframes, **kwargs):
+  label = kwargs.pop('label', None)
+  assert not kwargs  # TODO(Py3): Use PEP 3102
+  if label is None:
+    # Attempt to come up with a reasonable, stable label.
 
 Review comment:
   Could you add something like ".. by retrieving the name of these variables in the calling context"


[GitHub] [beam] robertwb commented on a change in pull request #11264: [BEAM-9496] Add to_dataframe and to_pcollection APIs.

URL: https://github.com/apache/beam/pull/11264#discussion_r407749163
 
 

 ##########
 File path: sdks/python/apache_beam/dataframe/convert.py
 ##########
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+
+import inspect
+
+from apache_beam import pvalue
+from apache_beam.dataframe import expressions
+from apache_beam.dataframe import frame_base
+from apache_beam.dataframe import transforms
+
+
+def to_dataframe(pc):
+  pass
+
+
+# TODO: Or should this be called as_dataframe?
 
 Review comment:
   So far we haven't added any methods to PCollection, but I'm open to the idea (though it'd be a big change to the API that should be done holistically; see https://lists.apache.org/thread.html/fcb422d61437a634662b24100d4e2d46a940ee766848b699023081d9%40%3Cdev.beam.apache.org%3E ). For now, at least, it seems a bit much to put dataframe methods on PCollection itself.
   
   Short of that, can you think of any fluent styles for
   
   ```
   from apache_beam import dataframe as ???
   ...
   pcol = p | "Read from Source" >> beam.io.SomeSchemaSource(foo)
   df = ???
   ```
