Posted to issues@beam.apache.org by "Henning Rohde (JIRA)" <ji...@apache.org> on 2018/11/07 20:19:00 UTC

[jira] [Assigned] (BEAM-5466) Cannot deploy job to Dataflow with RuntimeValueProvider

     [ https://issues.apache.org/jira/browse/BEAM-5466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Henning Rohde reassigned BEAM-5466:
-----------------------------------

    Assignee:     (was: Henning Rohde)

> Cannot deploy job to Dataflow with RuntimeValueProvider
> -------------------------------------------------------
>
>                 Key: BEAM-5466
>                 URL: https://issues.apache.org/jira/browse/BEAM-5466
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-dataflow, sdk-py-core
>    Affects Versions: 2.6.0
>         Environment: Python 2.7
>            Reporter: Mackenzie
>            Priority: Major
>
> I cannot deploy an Apache Beam job to Cloud Dataflow that contains runtime value parameters.
> The standard use case is Cloud Dataflow Templates, which use RuntimeValueProvider to supply template parameters.
> Whenever I call `get()` on the parameter, I get an error like:
> {noformat}
> apache_beam.error.RuntimeValueProviderError: RuntimeValueProvider(option: myparam, type: str, default_value: 'default-value').get() not called from a runtime context
> {noformat}
>  
> A minimal example:
> {code:python}
> import argparse
>
> import apache_beam as beam
> from apache_beam.options.pipeline_options import (
>     GoogleCloudOptions, PipelineOptions, SetupOptions, StandardOptions)
>
> class UserOptions(PipelineOptions):
>     @classmethod
>     def _add_argparse_args(cls, parser):
>         parser.add_value_provider_argument('--myparam', type=str, default='default-value')
>
> def run(argv=None):
>     parser = argparse.ArgumentParser()
>     known_args, pipeline_args = parser.parse_known_args(argv)
>     pipeline_options = PipelineOptions(pipeline_args)
>     pipeline_options.view_as(SetupOptions).save_main_session = True
>     google_cloud_options = pipeline_options.view_as(GoogleCloudOptions)
>     # insert google cloud options here, or pass them in arguments
>     standard_options = pipeline_options.view_as(StandardOptions)
>     standard_options.runner = 'DataflowRunner'
>     user_options = pipeline_options.view_as(UserOptions)
>     p = beam.Pipeline(options=pipeline_options)
>     param = user_options.myparam.get() # This line is the issue
>     result = p.run()
>     result.wait_until_finish()
> if __name__ == '__main__':
>     run()
> {code}
> I would expect the runtime-context check to be skipped (and the default value used) when running the script locally.
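For context on the error above: Beam's documented pattern is to defer `.get()` until pipeline execution (e.g. inside a `DoFn.process` method) rather than calling it at pipeline-construction time, which is what the minimal example does. The stdlib-only sketch below uses hypothetical names (`DeferredValue`, `run_pipeline`; not the actual apache_beam implementation) to illustrate why the call fails during construction and succeeds once a runner has bound the value:

```python
# Stdlib-only sketch of the deferred-value pattern (hypothetical names,
# not the real apache_beam classes). A DeferredValue's concrete value is
# only known once the "runner" binds template parameters, so .get() must
# not be called while the pipeline is still being constructed.

class RuntimeValueError(Exception):
    pass

class DeferredValue:
    """Holds an option whose concrete value is only known at runtime."""
    def __init__(self, name, default):
        self.name = name
        self.default = default
        self.runtime_value = None  # bound later by the "runner"

    def get(self):
        if self.runtime_value is None:
            raise RuntimeValueError(
                f"{self.name}.get() not called from a runtime context")
        return self.runtime_value

def run_pipeline(deferred, process_fn, template_params):
    # The "runner" binds template parameters before executing user code,
    # so user code invoked from here may safely call .get().
    deferred.runtime_value = template_params.get(deferred.name, deferred.default)
    return process_fn()

myparam = DeferredValue("myparam", "default-value")

# Construction time: .get() fails, mirroring RuntimeValueProviderError.
try:
    myparam.get()
except RuntimeValueError as e:
    print(e)  # myparam.get() not called from a runtime context

# "Runtime": the runner has bound the value, so .get() succeeds.
result = run_pipeline(myparam, lambda: myparam.get(), {"myparam": "live"})
print(result)  # live
```

This is why moving the `user_options.myparam.get()` call out of `run()` and into a transform executed by the runner avoids the error, while calling it at construction time raises even though a default is defined.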



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)