Posted to issues@beam.apache.org by "Chamikara Jayalath (Jira)" <ji...@apache.org> on 2019/09/16 16:37:00 UTC

[jira] [Commented] (BEAM-8240) SDK Harness

    [ https://issues.apache.org/jira/browse/BEAM-8240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16930682#comment-16930682 ] 

Chamikara Jayalath commented on BEAM-8240:
------------------------------------------

In the cross-language path there will be more than one SDK container, so I think the correct solution is for Dataflow to set a pointer to the container image in the environment payload of each transform and to determine the set of containers needed within the Dataflow service.


For custom containers, we should be updating the environment payload instead of just updating the worker_harness_container_image flag.
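A minimal sketch of the idea above: the runner collects the set of container images from the docker environments referenced by each transform, rather than from a single worker_harness_container_image flag. The dict-shaped messages and the function name are hypothetical stand-ins for the Beam runner API protos; only the `beam:env:docker:v1` URN is taken from the portability spec.

```python
# Hypothetical sketch: transforms and environments are modeled as plain dicts
# mirroring the pipeline proto's `transforms` and `environments` maps.
BEAM_ENV_DOCKER_URN = "beam:env:docker:v1"  # docker environment URN from the Beam portability spec

def containers_for_pipeline(transforms, environments):
    """Collect the distinct container images referenced by a pipeline's transforms.

    Each transform points at an environment by id; docker environments carry
    the image name in their payload, so a cross-language pipeline naturally
    yields more than one image.
    """
    images = set()
    for transform in transforms.values():
        env = environments[transform["environment_id"]]
        if env["urn"] == BEAM_ENV_DOCKER_URN:
            images.add(env["payload"]["container_image"])
    return sorted(images)
```

With this resolution step in the service, a cross-language pipeline with a Python transform and a Java transform would yield both SDK images, which a single flag cannot express.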


cc: [~robertwb]

> SDK Harness
> -----------
>
>                 Key: BEAM-8240
>                 URL: https://issues.apache.org/jira/browse/BEAM-8240
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py-harness
>            Reporter: Luke Cwik
>            Assignee: Luke Cwik
>            Priority: Minor
>
> The SDK harness incorrectly identifies itself in the environment field when a custom SDK container is used while building the pipeline proto.
>  
> Passing in the experiment *worker_harness_container_image=YYY* doesn't override the pipeline proto's environment field; it is still populated with *gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20190802*
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.2#803003)