Posted to reviews@spark.apache.org by HyukjinKwon <gi...@git.apache.org> on 2018/09/23 02:16:09 UTC

[GitHub] spark pull request #22523: [MINOR][PYSPARK] Always Close the tempFile in _se...

Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22523#discussion_r219686544
  
    --- Diff: python/pyspark/context.py ---
    @@ -537,8 +537,10 @@ def _serialize_to_jvm(self, data, serializer, reader_func, createRDDServer):
                 # parallelize from there.
                 tempFile = NamedTemporaryFile(delete=False, dir=self._temp_dir)
    --- End diff --
    
    Actually, we'd better use a context manager:
    
    ```python
    with NamedTemporaryFile(delete=False, dir=self._temp_dir) as tempFile:
        ...
    ```
    
    but not a big deal. LGTM
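    
    For reference, a minimal sketch of the pattern (illustrative only, not the actual `_serialize_to_jvm` code; the `serializer` object and `temp_dir` argument here are stand-ins): with `delete=False` the context manager still guarantees the handle is closed on exit, even if writing raises, while the file itself stays on disk so it can be read back by path.
    
    ```python
    from tempfile import NamedTemporaryFile
    
    def write_to_temp_file(data, serializer, temp_dir):
        # delete=False keeps the file on disk after the block, so a reader
        # can still open it by path; the context manager only closes the handle.
        with NamedTemporaryFile(delete=False, dir=temp_dir) as tempFile:
            serializer.dump_stream(data, tempFile)
        # __exit__ has closed the handle by this point, whether or not
        # dump_stream raised; the closed file object still exposes .name.
        return tempFile.name
    ```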


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org