Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/20 06:57:17 UTC

[GitHub] ScrapCodes opened a new pull request #19096: [SPARK-21869][SS] A cached Kafka producer should not be closed if any task is using it - adds inuse tracking.

URL: https://github.com/apache/spark/pull/19096
 
 
   ## What changes were proposed in this pull request?
   We track each producer by maintaining an `inuse` count of the tasks (threads) currently using it. If a producer is `inuse` when Guava issues an eviction, we move it to a queue (`closeQueue`) and periodically (rather than on a dedicated thread) check its `inuse` status, closing it once no task is using it. This way a producer is never closed while it is in use, and once evicted it is never assigned to a new task.
   
   We had to do this because Guava's cache does not allow a custom eviction strategy: https://github.com/google/guava/issues/3013
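   Roughly, the approach looks like the sketch below. This is only an illustration, not the actual patch: the `CachedProducer` wrapper, the `ProducerCache` object, the `evictExpired()` helper, and the cache settings are hypothetical names and placeholders used to show the in-use counting and deferred-close idea.
   ```scala
   // Sketch only: illustrative names, not the actual Spark code.
   import java.util.concurrent.{Callable, ConcurrentLinkedQueue, TimeUnit}
   import java.util.concurrent.atomic.AtomicInteger

   import com.google.common.cache.{CacheBuilder, RemovalListener, RemovalNotification}
   import org.apache.kafka.clients.producer.KafkaProducer

   // Wraps a Kafka producer together with a count of tasks currently using it.
   class CachedProducer(val producer: KafkaProducer[Array[Byte], Array[Byte]]) {
     private val inUseCount = new AtomicInteger(0)
     def acquire(): Unit = inUseCount.incrementAndGet()
     def release(): Unit = inUseCount.decrementAndGet()
     def inUse: Boolean = inUseCount.get() > 0
     def close(): Unit = producer.close()
   }

   object ProducerCache {
     // Producers that were evicted while still in use wait here until released.
     private val closeQueue = new ConcurrentLinkedQueue[CachedProducer]()

     private val removalListener = new RemovalListener[String, CachedProducer] {
       override def onRemoval(n: RemovalNotification[String, CachedProducer]): Unit = {
         val p = n.getValue
         if (p.inUse) closeQueue.add(p) // defer: some task is still writing with it
         else p.close()                 // safe to close immediately
       }
     }

     private val cache = CacheBuilder.newBuilder()
       .expireAfterAccess(10, TimeUnit.MINUTES) // placeholder expiry
       .removalListener(removalListener)
       .build[String, CachedProducer]()

     // Get (or create) a producer for `key` and mark it as in use by the caller.
     def acquire(key: String,
                 create: => KafkaProducer[Array[Byte], Array[Byte]]): CachedProducer = {
       val p = cache.get(key, new Callable[CachedProducer] {
         override def call(): CachedProducer = new CachedProducer(create)
       })
       p.acquire()
       p
     }

     // Called periodically: close deferred producers whose tasks have finished.
     def evictExpired(): Unit = {
       val it = closeQueue.iterator()
       while (it.hasNext) {
         val p = it.next()
         if (!p.inUse) { p.close(); it.remove() }
       }
     }
   }
   ```
   A task would call `acquire(...)`, write its records, then call `release()` in a `finally` block, so the removal listener never closes a producer out from under a running task.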
   ## How was this patch tested?
   Updated existing tests and added appropriate test cases.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org