Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/10/01 06:27:23 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Spark #2827

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2827/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9918] Make TryCrossLanguage match non Try API (#15633)

[noreply] [BEAM-12957] Add support for pyarrow 5.x (#15588)


------------------------------------------
[...truncated 649.89 KB...]
2021/10/01 06:18:16 Job state: RUNNING
2021/10/01 06:18:43 Job state: DONE
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/Impulse.out"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/PairWithRestriction0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x92A\x92A\x92A"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/PairWithRestriction0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x9fA\x9fA\x9fA"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x01\x01\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/Impulse.out"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\xc9 \xc9 \xc9 "  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\xf3\x19\xf3\x19\xf3\x19"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x9fA\x9fA\x9fA"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x80\x1a\x80\x1a\x80\x1a"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\xe6\x19\xe6\x19\xe6\x19"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n3"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/Remove Kafka Metadata/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n3_keyede7_1"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"G\xbd\x01\x02\x03"  labels:{key:"PCOLLECTION"  value:"n3"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\xe8\x07\xa0\x1e\x03\x04"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/Remove Kafka Metadata/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"U\xd6\x01\x02\x03"  labels:{key:"PCOLLECTION"  value:"n2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\xe8\x07\xa8\x84\x01\x10\x11"  labels:{key:"PCOLLECTION"  value:"n3_keyede7_1"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n5"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n5_keyede7_2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"n4"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\xe8\x07\xa8\x84\x01\x10\x11"  labels:{key:"PCOLLECTION"  value:"n5_keyede7_2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"G\xbd\x01\x02\x03"  labels:{key:"PCOLLECTION"  value:"n5"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x01\x01\x01"  labels:{key:"PCOLLECTION"  value:"n4"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/Remove Kafka Metadata/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"E\xe1\x01\x03\x04"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/Remove Kafka Metadata/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"3\xae\x1fNP"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x80\x1a\x80\x1a\x80\x1a"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\xf3\x19\xf3\x19\xf3\x19"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"7\xa7\"OQ"  labels:{key:"PCOLLECTION"  value:"wsJlPfsMCYExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read).output"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x00"  labels:{key:"PCOLLECTION"  value:"n9_keyede9_3"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x00"  labels:{key:"PCOLLECTION"  value:"n7"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x00"  labels:{key:"PCOLLECTION"  value:"n9"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"n6"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n8_keyede9_2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x00"  labels:{key:"PCOLLECTION"  value:"n7_keyede9_1"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\xe8\x07"  labels:{key:"PCOLLECTION"  value:"n8"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\xe8\x07\xa8\x84\x01\x10\x11"  labels:{key:"PCOLLECTION"  value:"n8_keyede9_2"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x01\x01\x01"  labels:{key:"PCOLLECTION"  value:"n6"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"O\xe6\x01\x02\x03"  labels:{key:"PCOLLECTION"  value:"n8"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1"  type:"beam:metrics:sum_int64:v1"  payload:"\x01"  labels:{key:"PCOLLECTION"  value:"n10"}
2021/10/01 06:18:43 Failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1"  type:"beam:metrics:distribution_int64:v1"  payload:"\x01\x01\x01\x01"  labels:{key:"PCOLLECTION"  value:"n10"}
--- PASS: TestKafkaIO_BasicReadWrite (49.96s)
PASS
ok  	github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/kafka	53.019s
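
For context on the "Failed to deduce Step from MonitoringInfo" warnings above: each MonitoringInfo reports its counter value as a protobuf base-128 varint in the payload field, so a payload of "\x01" is an element count of 1 and "\xe8\x07" is 1000. A minimal decoding sketch (Python, assuming the standard protobuf varint encoding; not part of the build output):

    # Decode a protobuf base-128 varint, as used by beam:metrics:sum_int64:v1 payloads.
    def decode_varint(data: bytes) -> int:
        result, shift = 0, 0
        for b in data:
            result |= (b & 0x7F) << shift
            if not b & 0x80:          # high bit clear: this is the last varint byte
                return result
            shift += 7
        raise ValueError("truncated varint")

    print(decode_varint(b"\x01"))      # 1
    print(decode_varint(b"\xe8\x07"))  # 1000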

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > combineGloballyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:259
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > partitionTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:316
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > coGroupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:232
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > groupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:200
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > combinePerKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:277
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > flattenTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:297
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > singleInputOutputTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:158
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > multiInputOutputWithSideInputTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:180
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

8 tests completed, 8 failed

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava FAILED

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > combineGloballyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:259
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > partitionTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:316
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > coGroupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:232
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > groupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:200
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > combinePerKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:277
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > flattenTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:297
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > singleInputOutputTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:158
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > multiInputOutputWithSideInputTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:180
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

8 tests completed, 8 failed

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython FAILED

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > pythonDependenciesTest FAILED
    org.apache.beam.vendor.grpc.v1p36p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:340
        Caused by: org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at Errors.java:124
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:125

1 test completed, 1 failed

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:2:job-server:sparkJobServerCleanup

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingJava/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPythonOnly/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 52s
215 actionable tasks: 153 executed, 55 from cache, 7 up-to-date

Publishing build scan...
https://gradle.com/s/3lb7d7hfaax4m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Jenkins build is back to normal : beam_PostCommit_XVR_Spark #2829

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2829/display/redirect?page=changes>


---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_XVR_Spark #2828

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2828/display/redirect>

Changes:


------------------------------------------
[...truncated 611.84 KB...]
    self._block(timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <grpc._utilities._ChannelReadyFuture object at 0x7ff8c01764e0>
timeout = 60

    def _block(self, timeout):
        until = None if timeout is None else time.time() + timeout
        with self._condition:
            while True:
                if self._cancelled:
                    raise grpc.FutureCancelledError()
                elif self._matured:
                    return
                else:
                    if until is None:
                        self._condition.wait()
                    else:
                        remaining = until - time.time()
                        if remaining < 0:
>                           raise grpc.FutureTimeoutError()
E                           grpc.FutureTimeoutError

../../build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_utilities.py:85: FutureTimeoutError
----------------------------- Captured stderr call -----------------------------
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
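
The grpc.FutureTimeoutError above means the portable runner gave up waiting for the Spark job server to become reachable. A minimal sketch of the readiness check that times out at job_server.py:54 (the localhost:8099 address is an assumption for illustration; the real endpoint comes from the pipeline options):

    import grpc

    # Hypothetical job-server endpoint, used only for this sketch.
    channel = grpc.insecure_channel("localhost:8099")
    try:
        # Same pattern as apache_beam/runners/portability/job_server.py:54 above.
        grpc.channel_ready_future(channel).result(timeout=60)
    except grpc.FutureTimeoutError:
        # Raised when the channel never becomes READY within the timeout,
        # which is what each SqlTransform test hit in this build.
        print("job server did not become ready")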
__________________ SqlTransformTest.test_windowing_before_sql __________________

self = <apache_beam.transforms.sql_test.SqlTransformTest testMethod=test_windowing_before_sql>

    def test_windowing_before_sql(self):
      with TestPipeline() as p:
        out = (
            p | beam.Create([
                SimpleRow(5, "foo", 1.),
                SimpleRow(15, "bar", 2.),
                SimpleRow(25, "baz", 3.)
            ])
            | beam.Map(lambda v: beam.window.TimestampedValue(v, v.id)).
            with_output_types(SimpleRow)
            | beam.WindowInto(
                beam.window.FixedWindows(10)).with_output_types(SimpleRow)
            | SqlTransform("SELECT COUNT(*) as `count` FROM PCOLLECTION"))
>       assert_that(out, equal_to([(1, ), (1, ), (1, )]))

apache_beam/transforms/sql_test.py:175: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:596: in __exit__
    self.result = self.run()
apache_beam/testing/test_pipeline.py:114: in run
    False if self.not_use_test_runner_api else test_runner_api))
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/portability/portable_runner.py:438: in run_pipeline
    job_service_handle = self.create_job_service(options)
apache_beam/runners/portability/portable_runner.py:317: in create_job_service
    return self.create_job_service_handle(server.start(), options)
apache_beam/runners/portability/job_server.py:54: in start
    grpc.channel_ready_future(channel).result(timeout=self._timeout)
../../build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_utilities.py:139: in result
    self._block(timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <grpc._utilities._ChannelReadyFuture object at 0x7ff8c0abf588>
timeout = 60

    def _block(self, timeout):
        until = None if timeout is None else time.time() + timeout
        with self._condition:
            while True:
                if self._cancelled:
                    raise grpc.FutureCancelledError()
                elif self._matured:
                    return
                else:
                    if until is None:
                        self._condition.wait()
                    else:
                        remaining = until - time.time()
                        if remaining < 0:
>                           raise grpc.FutureTimeoutError()
E                           grpc.FutureTimeoutError

../../build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_utilities.py:85: FutureTimeoutError
----------------------------- Captured stderr call -----------------------------
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
_________________ SqlTransformTest.test_zetasql_generate_data __________________

self = <apache_beam.transforms.sql_test.SqlTransformTest testMethod=test_zetasql_generate_data>

    def test_zetasql_generate_data(self):
      with TestPipeline() as p:
        out = p | SqlTransform(
            """SELECT
              CAST(1 AS INT64) AS `int`,
              CAST('foo' AS STRING) AS `str`,
              CAST(3.14  AS FLOAT64) AS `flt`""",
            dialect="zetasql")
>       assert_that(out, equal_to([(1, "foo", 3.14)]))

apache_beam/transforms/sql_test.py:160: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:596: in __exit__
    self.result = self.run()
apache_beam/testing/test_pipeline.py:114: in run
    False if self.not_use_test_runner_api else test_runner_api))
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/portability/portable_runner.py:438: in run_pipeline
    job_service_handle = self.create_job_service(options)
apache_beam/runners/portability/portable_runner.py:317: in create_job_service
    return self.create_job_service_handle(server.start(), options)
apache_beam/runners/portability/job_server.py:54: in start
    grpc.channel_ready_future(channel).result(timeout=self._timeout)
../../build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_utilities.py:139: in result
    self._block(timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <grpc._utilities._ChannelReadyFuture object at 0x7ff8c0c13048>
timeout = 60

    def _block(self, timeout):
        until = None if timeout is None else time.time() + timeout
        with self._condition:
            while True:
                if self._cancelled:
                    raise grpc.FutureCancelledError()
                elif self._matured:
                    return
                else:
                    if until is None:
                        self._condition.wait()
                    else:
                        remaining = until - time.time()
                        if remaining < 0:
>                           raise grpc.FutureTimeoutError()
E                           grpc.FutureTimeoutError

../../build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_utilities.py:85: FutureTimeoutError
----------------------------- Captured stderr call -----------------------------
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
================= 9 failed, 5008 deselected in 641.78 seconds ==================

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:2:job-server:sparkJobServerCleanup

FAILURE: Build completed with 7 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/go/test/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingJava/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPythonOnly/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

7: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 44s
215 actionable tasks: 154 executed, 54 from cache, 7 up-to-date

Publishing build scan...
https://gradle.com/s/rkxlxbn4yxase

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org