Posted to dev@beam.apache.org by Maximilian Michels <mx...@apache.org> on 2018/10/09 11:45:41 UTC
Log output from Dataflow tests
Hi,
I'm debugging a test failure in Dataflow PostCommit. There are logs
available which I can't access. Is it possible to be added to the
apache-beam-testing project?
Thanks,
Max
Example:
======================================================================
FAIL: test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify_PR/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py", line 175, in test_streaming_with_attributes
    self._test_streaming(with_attributes=True)
  File "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify_PR/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py", line 167, in _test_streaming
    timestamp_attribute=self.TIMESTAMP_ATTRIBUTE)
  File "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify_PR/src/sdks/python/apache_beam/io/gcp/pubsub_it_pipeline.py", line 91, in run_pipeline
    result = p.run()
  File "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify_PR/src/sdks/python/apache_beam/pipeline.py", line 416, in run
    return self.runner.run_pipeline(self)
  File "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify_PR/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 65, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError:
Expected: (Test pipeline expected terminated in state: RUNNING and
Expected 2 messages.)
but: Expected 2 messages. Got 0 messages. Diffs (item, count):
Expected but not in actual: [(PubsubMessage(data001-seen,
{'processed': 'IT'}), 1), (PubsubMessage(data002-seen, {'timestamp_out':
'2018-07-11T02:02:50.149000Z', 'processed': 'IT'}), 1)]
Unexpected: []
Stripped attributes: ['id', 'timestamp']
-------------------- >> begin captured stdout << ---------------------
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-08_14_41_03-9578125971484804239?project=apache-beam-testing.
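The "Diffs (item, count)" lines in the assertion output above are a multiset comparison of expected vs. received messages. A minimal sketch of that kind of diff, using a hypothetical helper rather than Beam's actual matcher code:

```python
from collections import Counter

def diff_messages(expected, actual):
    """Return (missing, unexpected) as (item, count) lists, in the same
    shape as the matcher output above. Hypothetical helper, not Beam's
    actual implementation."""
    exp, act = Counter(expected), Counter(actual)
    missing = sorted((exp - act).items())      # expected but not in actual
    unexpected = sorted((act - exp).items())   # actual but not expected
    return missing, unexpected

# Two messages expected, zero received -- as in the failure above.
missing, unexpected = diff_messages(['data001-seen', 'data002-seen'], [])
print(missing)      # [('data001-seen', 1), ('data002-seen', 1)]
print(unexpected)   # []
```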
Re: Log output from Dataflow tests
Posted by Ankur Goenka <go...@google.com>.
Hi Max, I don't have edit privileges for the project, so I can't modify users.
On Wed, Oct 10, 2018 at 9:02 AM Maximilian Michels <mx...@apache.org> wrote:
> Thank you Scott! Ismael also sent me the logs and I could fix the error.
>
> It seems we have granted read-only access to project members in the
> past. I just checked back with Ankur, he might be able to grant access
> for my GCP account.
>
> -Max
>
Re: Log output from Dataflow tests
Posted by Maximilian Michels <mx...@apache.org>.
Thank you Scott! Ismael also sent me the logs and I could fix the error.
It seems we have granted read-only access to project members in the
past. I just checked back with Ankur, he might be able to grant access
for my GCP account.
-Max
On 10.10.18 17:26, Scott Wegner wrote:
> I'm not sure how apache-beam-testing permissions are managed; Kenn,
> could we grant read-access for contributors who need it for testing?
Re: Log output from Dataflow tests
Posted by Scott Wegner <sc...@apache.org>.
I'm not sure how apache-beam-testing permissions are managed; Kenn, could
we grant read-access for contributors who need it for testing?
Here are two logs from the job that seem relevant:
2018-10-08 14:44:45.381 PDT
Parsing unknown args:
[u'--dataflowJobId=2018-10-08_14_41_03-9578125971484804239',
u'--autoscalingAlgorithm=NONE', u'--direct_runner_use_stacked_bundle',
u'--maxNumWorkers=0', u'--style=scrambled', u'--sleep_secs=20',
u'--pipeline_type_check',
u'--gcpTempLocation=gs://temp-storage-for-end-to-end-tests/temp-it/beamapp-jenkins-1008214058-522436.1539034858.522554',
u'--numWorkers=1', u'--beam_plugins=apache_beam.io.filesystem.FileSystem',
u'--beam_plugins=apache_beam.io.hadoopfilesystem.HadoopFileSystem',
u'--beam_plugins=apache_beam.io.localfilesystem.LocalFileSystem',
u'--beam_plugins=apache_beam.io.gcp.gcsfilesystem.GCSFileSystem',
u'--beam_plugins=apache_beam.io.filesystem_test.TestingFileSystem',
u'--beam_plugins=apache_beam.runners.interactive.display.pipeline_graph_renderer.PipelineGraphRenderer',
u'--beam_plugins=apache_beam.runners.interactive.display.pipeline_graph_renderer.MuteRenderer',
u'--beam_plugins=apache_beam.runners.interactive.display.pipeline_graph_renderer.TextRenderer',
u'--beam_plugins=apache_beam.runners.interactive.display.pipeline_graph_renderer.PydotRenderer',
u'--pipelineUrl=gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1008214058-522436.1539034858.522554/pipeline.pb']
2018-10-08 14:44:45.382 PDT
Python sdk harness failed: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker_main.py", line 133, in main
    sdk_pipeline_options.get_all_options(drop_default=True))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/options/pipeline_options.py", line 227, in get_all_options
    action='append' if num_times > 1 else 'store')
  File "/usr/lib/python2.7/argparse.py", line 1308, in add_argument
    return self._add_action(action)
  File "/usr/lib/python2.7/argparse.py", line 1682, in _add_action
    self._optionals._add_action(action)
  File "/usr/lib/python2.7/argparse.py", line 1509, in _add_action
    action = super(_ArgumentGroup, self)._add_action(action)
  File "/usr/lib/python2.7/argparse.py", line 1322, in _add_action
    self._check_conflict(action)
  File "/usr/lib/python2.7/argparse.py", line 1460, in _check_conflict
    conflict_handler(action, confl_optionals)
  File "/usr/lib/python2.7/argparse.py", line 1467, in _handle_conflict_error
    raise ArgumentError(action, message % conflict_string)
ArgumentError: argument --beam_plugins: conflicting option string(s): --beam_plugins
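The ArgumentError at the bottom of that traceback can be reproduced with stdlib argparse alone. A minimal sketch, unrelated to Beam's actual option registration (the log is from Python 2.7; the exact error wording differs slightly on Python 3):

```python
import argparse

# argparse rejects registering the same option string twice, regardless
# of the action used -- this is the conflict the worker hit.
parser = argparse.ArgumentParser()
parser.add_argument('--beam_plugins', action='append')
try:
    parser.add_argument('--beam_plugins', action='append')
except argparse.ArgumentError as e:
    print(e)  # "argument --beam_plugins: conflicting option string..."

# Passing the flag repeatedly on the command line is fine once it is
# registered with action='append'; the failure comes from the double
# registration, not from the repeated flag.
args = parser.parse_args(['--beam_plugins=a', '--beam_plugins=b'])
print(args.beam_plugins)  # ['a', 'b']
```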
On Wed, Oct 10, 2018 at 1:05 AM Maximilian Michels <mx...@apache.org> wrote:
> Would be great to provide access to Dataflow build logs.
>
> In the meantime, could someone with access send me the logs for the job
> below?
>
>
> https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-08_14_41_03-9578125971484804239?project=apache-beam-testing
>
> Thanks,
> Max
--
Got feedback? tinyurl.com/swegner-feedback
Re: Log output from Dataflow tests
Posted by Maximilian Michels <mx...@apache.org>.
Would be great to provide access to Dataflow build logs.
In the meantime, could someone with access send me the logs for the job
below?
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-08_14_41_03-9578125971484804239?project=apache-beam-testing
Thanks,
Max
On 09.10.18 13:45, Maximilian Michels wrote:
> Hi,
>
> I'm debugging a test failure in Dataflow PostCommit. There are logs
> available which I can't access. Is it possible to be added to the
> apache-beam-testing project?
>
> Thanks,
> Max