Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/04 03:41:14 UTC

[GitHub] [spark] viirya commented on a change in pull request #29639: [SPARK-32186][DOCS][PYTHON] User Guide - Debugging

viirya commented on a change in pull request #29639:
URL: https://github.com/apache/spark/pull/29639#discussion_r483366839



##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,187 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+=================
+Debugging PySpark
+=================
+
+PySpark uses Spark as an engine. If a PySpark application does not require interaction
+between Python workers and JVMs, Python workers are not launched. They are lazily launched only when
+Python native functions or data have to be handled, for example, when you execute pandas UDFs or
+PySpark RDD APIs.
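
For instance, a minimal sketch of a job with a pandas UDF as below would launch Python workers
(this assumes a running ``spark`` session, for example from the ``pyspark`` shell, and that PyArrow is installed):

.. code-block:: python

    import pandas as pd
    from pyspark.sql.functions import pandas_udf

    # A pandas UDF forces the data to be handled by Python on the executors,
    # so Python workers are launched to run it.
    @pandas_udf("long")
    def add_one(s: pd.Series) -> pd.Series:
        return s + 1

    spark.range(5).select(add_one("id")).show()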
+
+This page focuses on debugging such Python applications and workers rather than on debugging on the JVM side.
+Profiling and debugging the JVM are described at `Useful Developer Tools <https://spark.apache.org/developer-tools.html>`_.
+
+
+Remote Debugging (PyCharm)
+--------------------------
+
+To debug Python workers remotely, you connect from the Python worker to a debug server running in PyCharm.
+For ease of demonstration, this section describes remote debugging on a single machine.
+To debug PySpark applications running on other machines, refer to the full PyCharm-specific instructions
+documented `here <https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html#remote-debug-config>`_.
+
+First, choose **Edit Configuration...** from the **Run** menu. This opens the **Run/Debug Configurations** dialog.
+Click ``+`` on the toolbar and, from the list of available configurations, select **Python Debug Server**.
+Enter a name for this run/debug configuration, for example ``MyRemoteDebugger``, and specify the port number, for example ``12345``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug1.png
+    :alt: PyCharm remote debugger setting
+
+| After that, you should install the corresponding version of the ``pydevd-pycharm`` package. The previous dialog shows the command to install it.
+
+.. code-block:: text
+
+    pip install pydevd-pycharm~=<version of PyCharm on the local machine>
+
+In your current working directory, prepare a Python file as below:
+
+.. code-block:: bash
+
+    echo "from pyspark import daemon, worker
+    def remote_debug_wrapped(*args, **kwargs):
+        #======================Copy and paste from the previous dialog===========================
+        import pydevd_pycharm
+        pydevd_pycharm.settrace('localhost', port=12345, stdoutToServer=True, stderrToServer=True)
+        #========================================================================================
+        worker.main(*args, **kwargs)
+    daemon.worker_main = remote_debug_wrapped
+    if __name__ == '__main__':
+        daemon.manager()" > remote_debug.py
+
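As a quick sanity check, you can verify that the new module is importable, for instance as below
(assuming PySpark is installed in the same Python environment and that you run this from the same directory):

.. code-block:: bash

    python -c "import remote_debug"
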
+You will use this file as the Python worker in your PySpark applications by using the ``spark.python.daemon.module`` configuration.
+Run the ``pyspark`` shell with the configuration below:
+
+.. code-block:: bash
+
+    pyspark --conf spark.python.daemon.module=remote_debug
+
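The same configuration can also be passed when submitting an application; for example, a hypothetical
``app.py`` could be submitted as below:

.. code-block:: bash

    spark-submit --conf spark.python.daemon.module=remote_debug app.py
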
+Now you're ready to remote debug. Start debugging with your ``MyRemoteDebugger``.

Review comment:
       remote debug -> remotely debug

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,187 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+=================
+Debugging PySpark
+=================
+
+PySpark uses Spark as an engine. If a PySpark application does not require interaction
+between Python workers and JVMs, Python workers are not launched. They are lazily launched only when
+Python native functions or data have to be handled, for example, when you execute pandas UDFs or
+PySpark RDD APIs.
+
+This page focuses on debugging such Python applications and workers rather than on debugging on the JVM side.
+Profiling and debugging the JVM are described at `Useful Developer Tools <https://spark.apache.org/developer-tools.html>`_.
+
+
+Remote Debugging (PyCharm)
+--------------------------
+
+To debug Python workers remotely, you connect from the Python worker to a debug server running in PyCharm.
+For ease of demonstration, this section describes remote debugging on a single machine.
+To debug PySpark applications running on other machines, refer to the full PyCharm-specific instructions
+documented `here <https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html#remote-debug-config>`_.
+
+First, choose **Edit Configuration...** from the **Run** menu. This opens the **Run/Debug Configurations** dialog.
+Click ``+`` on the toolbar and, from the list of available configurations, select **Python Debug Server**.
+Enter a name for this run/debug configuration, for example ``MyRemoteDebugger``, and specify the port number, for example ``12345``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug1.png
+    :alt: PyCharm remote debugger setting
+
+| After that, you should install the corresponding version of the ``pydevd-pycharm`` package. The previous dialog shows the command to install it.
+
+.. code-block:: text
+
+    pip install pydevd-pycharm~=<version of PyCharm on the local machine>
+
+In your current working directory, prepare a Python file as below:
+
+.. code-block:: bash
+
+    echo "from pyspark import daemon, worker
+    def remote_debug_wrapped(*args, **kwargs):
+        #======================Copy and paste from the previous dialog===========================
+        import pydevd_pycharm
+        pydevd_pycharm.settrace('localhost', port=12345, stdoutToServer=True, stderrToServer=True)
+        #========================================================================================
+        worker.main(*args, **kwargs)
+    daemon.worker_main = remote_debug_wrapped
+    if __name__ == '__main__':
+        daemon.manager()" > remote_debug.py
+
+You will use this file as the Python worker in your PySpark applications by using the ``spark.python.daemon.module`` configuration.
+Run the ``pyspark`` shell with the configuration below:
+
+.. code-block:: bash
+
+    pyspark --conf spark.python.daemon.module=remote_debug
+
+Now you're ready to remote debug. Start debugging with your ``MyRemoteDebugger``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug2.png
+    :alt: PyCharm run remote debugger
+
+| After that, run a job that creates Python workers, for example, as below:
+
+.. code-block:: python
+
+    spark.range(10).repartition(1).rdd.map(lambda x: x).collect()
+
+
+Checking Memory and CPU Usage
+-----------------------------
+
+Python workers are typically monitored via ``top`` and ``ps`` commands because Python workers create multiple Python processes
+workers are created as processes. As an example, you can ``ps`` as below:

Review comment:
       Is `workers are created as processes` redundant?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org