Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/03 13:30:02 UTC

[GitHub] [spark] srowen commented on a change in pull request #29639: [SPARK-32186][DOCS][PYTHON] User Guide - Debugging

srowen commented on a change in pull request #29639:
URL: https://github.com/apache/spark/pull/29639#discussion_r482975706



##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+=================
+Debugging PySpark
+=================
+
+PySpark uses Spark as an engine. In case that PySpark applications do not require the interaction
+between Python workers and JVMs, Python workers are not launched. They are lazily launched only when
+Python native functions or data have to be handled, for example, when you execute pandas UDFs or
+PySpark RDD APIs.
+
+This page describes how to debug such Python workers instead of focusing on debugging with JVM.
+Profiling and debugging JVM is described at `Useful Developer Tools <https://spark.apache.org/developer-tools.html>`_.
+
+
+Remote Debugging (PyCharm)
+--------------------------
+
+In order to debug the Python workers remotely, you should connect from the Python worker to the debug server in PyCharm.
+In this section, it describes the remote debug within single machine to demonstrate easily.

Review comment:
       Maybe: "This section describes remote debugging within a single machine .."
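
A side note on the quoted intro paragraph: Python workers launch lazily, only when Python-native work has to run. A minimal sketch that forces them to start is a pandas UDF (this assumes PyArrow is installed, which pandas UDFs require):

    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import pandas_udf

    spark = SparkSession.builder.getOrCreate()

    @pandas_udf("long")
    def plus_one(s: pd.Series) -> pd.Series:
        # This body runs inside a Python worker process, not the JVM.
        return s + 1

    # Evaluating the UDF is what actually launches the Python workers.
    spark.range(3).select(plus_one("id")).show()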

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+PySpark uses Spark as an engine. In case that PySpark applications do not require the interaction

Review comment:
       Slightly better is "If a PySpark application does not require interaction ..."

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+In order to debug the Python workers remotely, you should connect from the Python worker to the debug server in PyCharm.
+In this section, it describes the remote debug within single machine to demonstrate easily.
+In order to debug PySpark applications in other machines, please refer to the full instructions that are specific
+to PyCharm is documented at `here <https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html#remote-debug-config>`_. 

Review comment:
       PyCharm, documented here

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+Firstly, choose **Run | Edit Configuration...** in the main manu, and it opens the Run/debug configurations dialog.

Review comment:
       Choose Edit Configuration from the Run menu
   manu -> menu
   
   Start a new sentence. "It opens the Run/Debug ..."

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+In order to debug the Python workers remotely, you should connect from the Python worker to the debug server in PyCharm.
+In this section, it describes the remote debug within single machine to demonstrate easily.
+In order to debug PySpark applications in other machines, please refer to the full instructions that are specific

Review comment:
       on other machines

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+You have to click ``+`` configuration on the toolbar, and from the list of available configurations, select **Python Debug Server**.
+Enter the name of this run/debug configuration, for example, ``MyRemoteDebugger`` and also specify the port number, for example ``12345``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug1.png
+    :alt: PyCharm remote debugger setting
+
+After that, you should install the corresponding version of ``pydevd-pycahrm`` package. In the previous dialog, it shows the command
+to install.
+
+.. code-block:: text
+
+    pip install pydevd-pycharm~=<version of PyCharm on the local machine>
+
+In your current working directly, prepare a Python file as below:
+
+.. code-block:: bash
+
+    echo "from pyspark import daemon, worker
+    def remote_debug_wrapped(*args, **kwargs):
+        #======================Copy and paste from the previous dialog===========================
+        import pydevd_pycharm
+        pydevd_pycharm.settrace('localhost', port=12345, stdoutToServer=True, stderrToServer=True)
+        #========================================================================================
+        worker.main(*args, **kwargs)
+    daemon.worker_main = remote_debug_wrapped
+    if __name__ == '__main__':
+        daemon.manager()" > remote_debug.py
+
+You will use this file as the Python workers in your PySpark applications by using ``spark.python.daemon.module`` configuration.

Review comment:
       workers -> worker
   using the ... configuration
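
A sketch of the same wiring done in code rather than on the pyspark command line, assuming remote_debug.py is importable from the driver's working directory and that no Python worker has been launched yet (the daemon module is read when the first worker starts):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        # Module name only, without the .py extension.
        .config("spark.python.daemon.module", "remote_debug")
        .getOrCreate()
    )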

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+Run ``pyspark`` shell with the configuration below:
+
+.. code-block:: bash
+
+    pyspark --conf spark.python.daemon.module=remote_debug
+
+Now you're ready to remote debug. Start debug with your ``MyRemoteDebugger``.

Review comment:
       Start debugging

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+You will use this file as the Python workers in your PySpark applications by using ``spark.python.daemon.module`` configuration.
+Run ``pyspark`` shell with the configuration below:

Review comment:
       Run the pyspark shell with ..., or Run pyspark with

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+.. image:: ../../../../docs/img/pyspark-remote-debug2.png
+    :alt: PyCharm run remote debugger
+
+After that, run a job that creates a Python workers, for example, as below:
+
+.. code-block:: python
+
+    spark.range(10).repartition(1).rdd.map(lambda x: x).collect()
+
+
+Checking Memory and CPU Usage
+-----------------------------
+
+Python workers are typically monitored via ``top`` and ``ps`` commands because Python workers launch multiple Python

Review comment:
       Python workers create multiple Python processes
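
One way to know which processes to watch with top or ps: have the workers report their own PIDs. A small sketch, not part of the PR:

    import os
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Each partition is processed by a Python worker; collect the distinct PIDs.
    pids = (
        spark.sparkContext
        .range(0, 8, numSlices=8)
        .map(lambda _: os.getpid())
        .distinct()
        .collect()
    )
    print(pids)  # e.g. `top -p <pid>` or `ps -o pid,rss,pcpu -p <pid>`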

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+After that, run a job that creates a Python workers, for example, as below:

Review comment:
       creates Python workers

##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,188 @@
+After that, you should install the corresponding version of ``pydevd-pycahrm`` package. In the previous dialog, it shows the command

Review comment:
       of the ... package




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


