Posted to commits@beam.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2017/05/01 12:33:04 UTC

[jira] [Commented] (BEAM-2074) SourceDStream's rate control mechanism may not work

    [ https://issues.apache.org/jira/browse/BEAM-2074?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990755#comment-15990755 ] 

ASF GitHub Bot commented on BEAM-2074:
--------------------------------------

Github user asfgit closed the pull request at:

    https://github.com/apache/beam/pull/2733


> SourceDStream's rate control mechanism may not work
> ---------------------------------------------------
>
>                 Key: BEAM-2074
>                 URL: https://issues.apache.org/jira/browse/BEAM-2074
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-spark
>            Reporter: Stas Levin
>            Assignee: Stas Levin
>
> If the {{boundMaxRecords}} for {{SourceDStream}} is not set (as a result of users choosing not to specify {{SparkPipelineOptions.setMaxRecordsPerBatch()}}), {{SourceDStream}} is designed to use {{rateControlledMaxRecords()}} to determine the max number of records it should be reading.
> However, at the moment {{SourceDStream}} consults {{rateControlledMaxRecords()}} only once, when it is created. Subsequent
> instantiations of {{MicrobatchSource}} therefore keep using the now-set-in-stone {{boundMaxRecords}} value without consulting {{rateControlledMaxRecords()}} again, making the rate control mechanism sub-optimal.
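A minimal sketch of the pattern described above, assuming simplified, hypothetical signatures (the names mirror the classes mentioned in the issue but are not the actual Beam Spark runner code). It contrasts capturing the rate-controlled bound once at construction time with re-consulting the rate controller for each micro-batch:

    // Illustrative sketch only; signatures are simplified assumptions.
    class SourceDStream {
      private final long boundMaxRecords;

      SourceDStream(long userMaxRecords) {
        // Problematic pattern: if the user did not set maxRecordsPerBatch,
        // fall back to the rate controller -- but only at this single point
        // in time, freezing the value for all later micro-batches.
        this.boundMaxRecords =
            userMaxRecords > 0 ? userMaxRecords : rateControlledMaxRecords();
      }

      MicrobatchSource nextMicrobatch() {
        // Every micro-batch reuses the value captured at construction.
        return new MicrobatchSource(boundMaxRecords);
      }

      // A possible direction for a fix: consult the rate controller again
      // for each micro-batch instead of once.
      MicrobatchSource nextMicrobatchRateAware(long userMaxRecords) {
        long bound =
            userMaxRecords > 0 ? userMaxRecords : rateControlledMaxRecords();
        return new MicrobatchSource(bound);
      }

      private long rateControlledMaxRecords() {
        // Placeholder: in the Spark runner this would come from Spark's
        // backpressure/rate-controller estimate for the current batch.
        return 1000L;
      }
    }

    class MicrobatchSource {
      MicrobatchSource(long maxRecords) { /* read at most maxRecords */ }
    }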



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)