Posted to issues@spark.apache.org by "mohamed imran (JIRA)" <ji...@apache.org> on 2017/11/24 12:32:00 UTC

[jira] [Commented] (SPARK-22526) Document closing of PortableDataInputStream in binaryFiles

    [ https://issues.apache.org/jira/browse/SPARK-22526?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265232#comment-16265232 ] 

mohamed imran commented on SPARK-22526:
---------------------------------------

[~stevel@apache.org] Thanks for your suggestions.

I made the change in my code as per your inputs and tested it. It worked fine without any issues.
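
For reference, a minimal sketch of the kind of change I made (the bucket path and the byte-counting logic are only placeholders; the essential part is closing the stream returned by PortableDataStream.open() in a finally block):

import java.io.DataInputStream
import org.apache.spark.input.PortableDataStream

// Placeholder S3 path; substitute your own bucket/prefix.
val files = sc.binaryFiles("s3a://test/")

val sizes = files.map { case (path, pds: PortableDataStream) =>
  val in: DataInputStream = pds.open()   // opens the underlying S3 input stream
  try {
    // ... read from `in` as needed; here we just count the bytes ...
    var count = 0L
    while (in.read() != -1) count += 1
    (path, count)
  } finally {
    in.close()   // closing the stream releases the S3 connection, so no more CLOSE_WAIT sockets
  }
}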

ubuntu@test.0:~$ netstat -a | grep CLOSE_WAIT | wc
      5      30     400
ubuntu@test.0:~$ netstat -a | grep CLOSE_WAIT | wc
      4      24     320
ubuntu@test.0:~$ netstat -a | grep CLOSE_WAIT | wc
      2      12     160
It is closing the stream properly, and there are no more hangs while reading from S3.

Sure, I want to add it to the docs with the proper syntax. How can I edit it?

> Document closing of PortableDataInputStream in binaryFiles
> ----------------------------------------------------------
>
>                 Key: SPARK-22526
>                 URL: https://issues.apache.org/jira/browse/SPARK-22526
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, Spark Core
>    Affects Versions: 2.2.0
>            Reporter: mohamed imran
>            Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Hi,
> I am using Spark 2.2.0 (the most recent version) to read binary files from S3. I use sc.binaryFiles to read the files.
> It works fine until around 100 files have been read, but then it hangs indefinitely, anywhere from 5 up to 40 minutes, similar to the Avro file read issue (which was fixed in later releases).
> I tried setting fs.s3a.connection.maximum to some large values, but it didn't help.
> Finally I ended up enabling the Spark speculation setting, which again didn't help much.
> One thing which I observed is that it is not closing the connection after every read of a binary file from S3.
> Example: sc.binaryFiles("s3a://test/test123.zip")
> Please look into this major issue!      



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org