Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/05/10 12:28:43 UTC
Build failed in Jenkins: beam_LoadTests_Java_ParDo_Dataflow_Batch #3
See <https://builds.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/3/display/redirect?page=changes>
Changes:
[jkp] Loosen up dependency specifications
[samuelw] GrpcWindmillWorker improvements: - Log error statuses returned from
[github] Add upper bounds.
[ttanay100] withNumFileShards must be used when using withTriggeringFrequency
[ehudm] [BEAM-6669] Set Dataflow KMS key name (Python)
[juta.staes] [BEAM-6769] write bytes to bigquery in python 2
[kcweaver] [BEAM-6760] website: disable testing external links by default
[kcweaver] add website postcommit test that checks external links
[amaliujia] [BEAM-7070] JOIN condition should accept field access
[amaliujia] [sql] ignore Nexmark SQL queries that has non equal join.
[amaliujia] [sql] generalize RexInputRef and RexFieldAccess in JOIN.
[roman] fix splitting into filters and buckets
[roman] tidy up println
[mxm] [BEAM-3863] Ensure correct firing of processing time timers
[daniel.o.programmer] [BEAM-7087] Creating "errors" package internal to Beam Go SDK
[mxm] [BEAM-7128] Parallelism is unavailable when applying
[github] Do not use snappy if python-snappy is not installed
[kcweaver] [BEAM-7110] Add Spark master option to SparkJobServerDriver
[github] fix
[github] fix
[markliu] [BEAM-6908] Support Python3 performance benchmarks - part 2
[ehudm] [BEAM-7062] Fix pydoc for @retry.with_exponential_backoff
[ehudm] Make utility functions private ...
[ttanay100] [BEAM-5503] Update BigQueryIO time partitioning doc snippet
[amaliujia] [SQL] expose a public entry for toRowList.
[bhulette] Add null check to fieldToAvatica
[kcweaver] [BEAM-7133] make Spark job server Gradle task pass along system
[aromanenko.dev] [BEAM-4552] Use Spark AccumulatorsV2 for metrics
[mxm] [BEAM-7012] Support TestStream in streaming Flink Runner
[thw] [BEAM-7015] Remove duplicate standard_coders.yaml
[markliu] Fix Python wordcount benchmark by specifiying beam_it_module
[harshit] refactor : standardize the kotlin samples
[markliu] [BEAM-6908] New Jenkins branch for Python35 benchmark
[daniel.o.programmer] [BEAM-5709] Changing sleeps to CountdownLatch in test
[heejong] [BEAM-7143] adding withConsumerConfigUpdates
[25622840+adude3141] Merge pull request #8388: Add OSSIndex CVE audit gradle plugin
[github] Fix small bug in top documentation
[robertwb] Allow inference of CombiningValueState coders.
[mxm] Increase Flink parallelism for ValidatesRunner tests
[mxm] [BEAM-7029] Add KafkaIO.Write as an external transform
[github] Merge pull request #8385: Fix broken parsing of uuid
[apilloud] Website changes for 2.12.0
[apilloud] Update release guide with feedback from 2.12.0 release
[apilloud] Add 2.12.0 Blog post.
[robert] Avoid depending on context formating in hooks_test
[pabloem] [BEAM-7139] Blogpost for Kotlin Samples (#8391)
[robertwb] [BEAM-7157] Allow creation of BinaryCombineFn from lambdas.
[robertwb] [BEAM-2939] Support SDF expansion in the Flink runner.
[pabloem] Add some pydoc for runner classes in Python
[valentyn] Adds a possibility to pass a project that hosts Cloud Spanner instance
[iemejia] [BEAM-7162] Add ValueProvider to CassandraIO Write
[iemejia] [BEAM-6479] Deprecate AvroIO.RecordFormatter
[iemejia] [BEAM-5311] Delete docker build images
[ehudm] Python Datastore IO using google-cloud-datastore
[ajamato] Update dataflow container version to beam-master-20190426
[daniel.o.programmer] [BEAM-7087] Updating Go SDK errors in base beam directory
[daniel.o.programmer] [BEAM-7087] Moving "errors" package to its own subdirectory.
[github] Merge pull request #8311: Allow Schema field selections in DoFn using
[iemejia] [BEAM-7076] Update Spark runner to use spark version 2.4.2
[iemejia] [BEAM-6526] Add ReadFiles and ParseFiles transforms for AvroIO
[relax] Ensure that all nested schemas are given ids, and fix bug where nullable
[thw] [BEAM-7112] Timer race with state cleanup - take two
[robertwb] fixup
[mxm] [BEAM-7170] Fix exception when retrieving ExpansionServer port after
[aromanenko.dev] [BEAM-6850] Use HadoopFormatIOIT for perfomance/integration testing
[chamikara] Updates Dataflow runner to support external ParDos.
[kedin] [BEAM-7072][SQL][Nexmark] Disable Query5
[daniel.o.programmer] [BEAM-7087] Updating Go SDK errors (Part 1)
[daniel.o.programmer] [BEAM-7178] Adding simple package comment to Go "errors" package.
[juta.staes] [BEAM-6621][BEAM-6624] add direct runner and dataflow runner it test
[ajamato] [BEAM-4374] Emit SampledByteCount distribution tuple system metric from
[pabloem] Support the newest version of httplib2 (#8263)
[github] Add clarifying comment for MonitoringInfoLabel usage.
[lcwik] [BEAM-3344] Changes for serialize/deserialize PipelineOptions via
[amaliujia] [BEAM-7166] Add more checks on join condition.
[kcweaver] [BEAM-7176] don't reuse Spark context in tests (causes OOM)
[25622840+adude3141] [BEAM-6859] align teardown with setup calls also for empty streaming
[25622840+adude3141] [BEAM-7193] ParDoLifecycleTest: remove duplicated inner class
[github] Revert "[BEAM-5709] Changing sleeps to CountdownLatch in test"
[chamikara] Bumps down the httplib2 version since it conflicts with
[ehudm] Add pip check invocations to all tox environments.
[iemejia] [BEAM-6526] Remove unneeded methods in AvroIO ReadFiles and ParseFiles
[iemejia] [BEAM-6526] Refactor AvroIO Read/Parse to use FileIO +
[iemejia] [BEAM-6526] Add populateDisplayData on AvroIO.ReadFiles
[iemejia] [BEAM-7196] Add Display Data to FileIO Match/MatchAll
[lcwik] [BEAM-6669] Set Dataflow KMS key name (#8296)
[migryz] FnApiMonitoringInfoToCounterUpdateTranformer with User counters
[leo.unc] Use TFRecord to store intermediate cache results using PCollection's
[migryz] Downgrade logging level to avoid log spam
[kcweaver] [BEAM-7201] Go README: update task name
[lcwik] [BEAM-7179] Correct file extension (#8434)
[pabloem] Fixing fileio tests for windows
[iemejia] [BEAM-6605] Deprecate TextIO.readAll and TextIO.ReadAll transform
[iemejia] [BEAM-6605] Refactor TextIO.Read and its Tests to use FileIO + ReadFiles
[iemejia] [BEAM-6606] Deprecate AvroIO ReadAll and ParseAll transforms
[iemejia] [BEAM-7205] Remove MongoDb withKeepAlive configuration
[mxm] [BEAM-7171] Ensure bundle finalization during Flink checkpoint
[mxm] [website] Link design document on cross-language and legacy IO
[aromanenko.dev] [BEAM-6850] Add metrics collectioning to HadoopFormatIOIT
[iemejia] [BEAM-7211] Implement HasDefaultTracker interface in ByteKeyRange
[yoshiki.obata] [BEAM-7137] encode header to bytes when writing to file at
[ehudm] Update Dataflow BEAM_CONTAINER_VERSION
[yifanzou] [BEAM-7213] fix beam_PostRelease_NightlySnapshot
[relax] Fix non-determistic row access
[daniel.o.programmer] [BEAM-7154] Updating Go SDK errors (Part 2)
[bhulette] Move schema assignment onto Create builder
[yifanzou] [BEAM-7202] fix inventory jobs
[github] Adding documentation to DirectRunner functions. (#8464)
[angoenka] Update beam_fn_api.proto to specify allowed split points. (#8476)
[aromanenko.dev] [BEAM-6247] Remove deprecated module “hadoop-input-format”
[mxm] [BEAM-7192] Fix partitioning of buffered elements during checkpointing
[github] Update setup.py for pyarrow
[robert] fix format string errors with errors package.
[robert] Update Go protos.
[heejong] [BEAM-7008] adding UTF8 String coder to Java SDK ModelCoder
[heejong] [BEAM-7102] Adding `jar_packages` experiment option for Python SDK
[bhulette] Fix EXCEPT DISTINCT behavior
[iemejia] Refine Spark ValidatesRunner exclusions
[github] Add Bigtable to supported Beam connectors
[pabloem] [BEAM-2939] Initial SyntheticSDF as Source and add an Synthetic pipeline
[kcweaver] [BEAM-6966] Spark portable runner: translate READ
[amaliujia] [sql] reject aggreagtion distinct rather than executing distinct as all.
[juta.staes] [BEAM-7066] re-add python 3.6 and 3.7 precommit test suites
[iemejia] [BEAM-7227] Instantiate PipelineRunner from options to support other
[iemejia] Categorize missing unbounded NeedsRunner tests in sdks/java/core
[frederik.bode] Fix fastavro on Python 3
[frederik.bode] fixup: Fix fastavro on Python 3
[iemejia] Minor fixes related to NeedsRunner/ValidatesRunner tests
[mxm] [BEAM-5197] Guard access to test timer service by a lock
[25622840+adude3141] [BEAM-7229] ParDoLifecycleTest: remove duplicated test methods
[github] Adding PyDoc to CombiningValueStateSpec (#8477)
[iemejia] Add UsesSchema category to Schema transform tests
[iemejia] Categorize TestStreamsTest tests that use processing time
[pabloem] Skipping fileio tests on windows
[richard.moorhead] [BEAM-7216] reinstate checks for Kafka client methods
[markliu] Fix project path in beam_PerformanceTests_Python35
[relax] Merge pull request #8485: Fixed bug where fieldIdsAccessed order could
[ankurgoenka] Fix py3 "Map modified during iteration"
[echauchot] Add Structured streaming Spark Runner design document
[michal.walenia] [BEAM-7028] Add Combine Java load tests
[pabloem] Fixing fileio tests for windows
[boyuanz] Move test_sdf_synthetic_source to FnApiRunnerTest
[kcweaver] [BEAM-7214] Run Python validates runner tests on Spark
[leo.unc] Revert "Use TFRecord to store intermediate cache results using
[amaliujia] [BEAM-5644] make Planner configurable.
[pabloem] [BEAM-6986] TypeHints Py3 Error: Multiple tests fail with a generator
[25622840+adude3141] [BEAM-7197] ensure DoFn is torn down after exception thrown in lifecycle
[kedin] [SQL] Remove TYPE_ prefix from DataCatalogTableProvider
[kedin] [SQL] Refactor BeamSqlEnv
[kedin] [SQL][Fix] Fix DataCatalog MAP type
[github] [BEAM-7241] Bump the version of apitools.
[github] [BEAM-6892] Adding support for side inputs on table & schema. Also
[github] Adding Python samples to the Timely (and stateful) Processing post.
[kedin] [SQL] Upgrade DataCatalog client to 0.4.0-alpha
[iemejia] [BEAM-7239] Add withDataSourceProviderFn() to JdbcIO
[iemejia] [BEAM-7041] Refactor pool creation to be done via composition
[mxm] [BEAM-7012] Support TestStream in portable Flink Runner
[kcweaver] [BEAM-7134] Spark portable runner: executable stage sinks should have
[github] Update reviewer list for Py3 contributions.
[iemejia] [BEAM-7238] Make sfl4j-simple a runtime only dependency
[iemejia] [BEAM-7238] Make sfl4j-jdk14 a runtime only dependency
[github] [BEAM-7173] Avoiding schema autodetection by default in WriteToBigQuery
[ankurgoenka] Add and disable tests for optimized portable pipeline
[samuelw] Modify StreamingDataflowWorker commit loop to only create a commit
[github] Changes to SDF API to use DoFn Params (#8430)
[amaliujia] [sql] fix non return bug.
[rezarokni] Add Snippets for 3 new Beam Patterns 2 X FilePatterns 1 X SideInput
[ankurgoenka] Add Python optimized test for Flink
[michal.walenia] [BEAM-7228] Add a Jenkins job running ValidatesRunner tests for Direct
[michal.walenia] [BEAM-7005] Add streaming GBK load test
[ankurgoenka] Moving to 2.14.0-SNAPSHOT on master branch.
[heejong] [BEAM-7257] adding withProducerConfigUpdates
[iemejia] [BEAM-7263] Deprecate set/getClientConfiguration in JdbcIO
[iemejia] [BEAM-7265] Update Spark runner to use spark version 2.4.3
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/ws/>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
> git --version # timeout=10
> git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6679b00138a5b82a6a55e7bc94c453957cea501c (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 6679b00138a5b82a6a55e7bc94c453957cea501c
Commit message: "Merge pull request #8548: [BEAM-7265] Update Spark runner to use spark version 2.4.3"
> git rev-list --no-walk 443412d9650eb5da4af7250e7c851853574b754e # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_ParDo_Dataflow_Batch] $ /bin/bash -xe /tmp/jenkins6360043337913812409.sh
+ echo "*** Load test: ParDo 2GB 100 byte records 10 times ***"
*** Load test: ParDo 2GB 100 byte records 10 times ***
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=org.apache.beam.sdk.loadtests.ParDoLoadTest -Prunner=:beam-runners-google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Dataflow_batch_ParDo_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_dataflow_batch_ParDo_1 --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90} --iterations=10 --numberOfCounters=1 --numberOfCounterOperations=0 --maxNumWorkers=5 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=false --runner=DataflowRunner' :beam-sdks-java-load-tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :beam-sdks-java-core:generateAvroProtocol NO-SOURCE
> Task :beam-runners-local-java-core:processResources NO-SOURCE
> Task :beam-runners-core-construction-java:processResources NO-SOURCE
> Task :beam-runners-core-java:processResources NO-SOURCE
> Task :beam-sdks-java-fn-execution:processResources NO-SOURCE
> Task :beam-vendor-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-sdks-java-harness:processResources NO-SOURCE
> Task :beam-sdks-java-extensions-google-cloud-platform-core:processResources NO-SOURCE
> Task :beam-runners-java-fn-execution:processResources NO-SOURCE
> Task :beam-sdks-java-core:generateAvroJava NO-SOURCE
> Task :beam-runners-direct-java:processResources NO-SOURCE
> Task :beam-model-job-management:extractProto
> Task :beam-model-fn-execution:extractProto
> Task :beam-sdks-java-extensions-protobuf:extractProto
> Task :beam-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:processResources NO-SOURCE
> Task :beam-sdks-java-io-synthetic:processResources NO-SOURCE
> Task :beam-sdks-java-test-utils:processResources NO-SOURCE
> Task :beam-sdks-java-io-kafka:processResources NO-SOURCE
> Task :beam-sdks-java-io-kinesis:processResources NO-SOURCE
> Task :beam-sdks-java-io-google-cloud-platform:processResources NO-SOURCE
> Task :beam-sdks-java-load-tests:processResources NO-SOURCE
> Task :beam-model-fn-execution:processResources
> Task :beam-runners-google-cloud-dataflow-java:processResources
> Task :beam-model-job-management:processResources
> Task :beam-sdks-java-core:generateGrammarSource FROM-CACHE
> Task :beam-sdks-java-core:processResources
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractIncludeProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractProto
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-model-pipeline:extractProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:generateProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:compileJava FROM-CACHE
> Task :beam-runners-google-cloud-dataflow-java-windmill:processResources
> Task :beam-runners-google-cloud-dataflow-java-windmill:classes
> Task :beam-model-pipeline:generateProto
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes
> Task :beam-runners-google-cloud-dataflow-java-windmill:shadowJar
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:classes
> Task :beam-sdks-java-core:shadowJar
> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE
> Task :beam-runners-local-java-core:compileJava FROM-CACHE
> Task :beam-runners-local-java-core:classes UP-TO-DATE
> Task :beam-runners-local-java-core:shadowJar
> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava FROM-CACHE
> Task :beam-sdks-java-extensions-google-cloud-platform-core:classes UP-TO-DATE
> Task :beam-sdks-java-fn-execution:compileJava FROM-CACHE
> Task :beam-sdks-java-fn-execution:classes UP-TO-DATE
> Task :beam-sdks-java-io-kafka:compileJava FROM-CACHE
> Task :beam-sdks-java-io-kafka:classes UP-TO-DATE
> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :beam-vendor-sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :beam-sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :beam-sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :beam-sdks-java-extensions-protobuf:shadowJar
> Task :beam-runners-core-construction-java:compileJava FROM-CACHE
> Task :beam-runners-core-construction-java:classes UP-TO-DATE
> Task :beam-sdks-java-io-kafka:shadowJar
> Task :beam-sdks-java-fn-execution:shadowJar
> Task :beam-runners-core-construction-java:shadowJar
> Task :beam-runners-core-java:compileJava FROM-CACHE
> Task :beam-runners-core-java:classes UP-TO-DATE
> Task :beam-runners-core-java:shadowJar
> Task :beam-vendor-sdks-java-extensions-protobuf:shadowJar
> Task :beam-sdks-java-extensions-google-cloud-platform-core:shadowJar
> Task :beam-sdks-java-test-utils:compileJava FROM-CACHE
> Task :beam-sdks-java-test-utils:classes UP-TO-DATE
> Task :beam-sdks-java-harness:compileJava FROM-CACHE
> Task :beam-sdks-java-harness:classes UP-TO-DATE
> Task :beam-sdks-java-io-synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :beam-sdks-java-harness:jar
> Task :beam-sdks-java-io-google-cloud-platform:compileJava FROM-CACHE
> Task :beam-sdks-java-io-google-cloud-platform:classes UP-TO-DATE
> Task :beam-sdks-java-io-google-cloud-platform:shadowJar
> Task :beam-runners-google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :beam-runners-google-cloud-dataflow-java:classes
> Task :beam-sdks-java-test-utils:shadowJar
> Task :beam-sdks-java-io-synthetic:classes
> Task :beam-sdks-java-io-kinesis:compileJava
> Task :beam-sdks-java-io-kinesis:classes
> Task :beam-sdks-java-io-kinesis:shadowJar
> Task :beam-runners-google-cloud-dataflow-java:shadowJar
> Task :beam-sdks-java-io-synthetic:shadowJar
> Task :beam-sdks-java-harness:shadowJar
> Task :beam-runners-java-fn-execution:compileJava FROM-CACHE
> Task :beam-runners-java-fn-execution:classes UP-TO-DATE
> Task :beam-runners-java-fn-execution:shadowJar
> Task :beam-runners-direct-java:compileJava FROM-CACHE
> Task :beam-runners-direct-java:classes UP-TO-DATE
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:compileJava FROM-CACHE
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:classes UP-TO-DATE
> Task :beam-runners-direct-java:shadowJar
> Task :beam-sdks-java-load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :beam-sdks-java-load-tests:classes
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:shadowJar
> Task :beam-sdks-java-load-tests:shadowJar
> Task :beam-sdks-java-load-tests:run FAILED
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalArgumentException: Please mark non-public interface org.apache.beam.sdk.loadtests.ParDoLoadTest$Options as public. The JVM requires that all non-public interfaces to be in the same package which will prevent the PipelineOptions proxy class to implement all of the interfaces.
at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:191)
at org.apache.beam.sdk.options.PipelineOptionsFactory.validateClass(PipelineOptionsFactory.java:952)
at org.apache.beam.sdk.options.PipelineOptionsFactory.access$2200(PipelineOptionsFactory.java:115)
at org.apache.beam.sdk.options.PipelineOptionsFactory$Cache.validateWellFormed(PipelineOptionsFactory.java:1893)
at org.apache.beam.sdk.options.PipelineOptionsFactory$Cache.validateWellFormed(PipelineOptionsFactory.java:1834)
at org.apache.beam.sdk.options.PipelineOptionsFactory$Cache.access$800(PipelineOptionsFactory.java:1770)
at org.apache.beam.sdk.options.PipelineOptionsFactory.parseObjects(PipelineOptionsFactory.java:1587)
at org.apache.beam.sdk.options.PipelineOptionsFactory.access$400(PipelineOptionsFactory.java:115)
at org.apache.beam.sdk.options.PipelineOptionsFactory$Builder.as(PipelineOptionsFactory.java:298)
at org.apache.beam.sdk.loadtests.LoadTestOptions.readFromArgs(LoadTestOptions.java:72)
at org.apache.beam.sdk.loadtests.LoadTest.<init>(LoadTest.java:74)
at org.apache.beam.sdk.loadtests.ParDoLoadTest.<init>(ParDoLoadTest.java:68)
at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:87)
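The failure above is the constraint the error message describes: PipelineOptionsFactory builds a dynamic proxy class that must implement every requested options interface, and the JVM only allows that freely when the interfaces are public (a proxy over non-public interfaces is confined to those interfaces' package). The fix in the job's code is simply to declare ParDoLoadTest.Options public. The sketch below uses only the JDK's java.lang.reflect.Proxy to illustrate the mechanism; the interface name and default value are illustrative, not the actual ParDoLoadTest.Options definition.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class OptionsProxySketch {
  // Hypothetical stand-in for a load-test options interface. Because it is
  // declared public, a dynamic proxy can implement it alongside interfaces
  // from any other package -- the requirement PipelineOptionsFactory enforces.
  public interface Options {
    int iterations();
  }

  public static Options create() {
    InvocationHandler handler =
        (proxy, method, methodArgs) -> {
          if (method.getName().equals("iterations")) {
            return 10; // illustrative default, echoing --iterations=10 in the log
          }
          return null;
        };
    // Were Options non-public, the proxy class would be pinned to Options'
    // package; mixing non-public interfaces from different packages would
    // throw IllegalArgumentException, which is what Beam's check preempts.
    return (Options)
        Proxy.newProxyInstance(
            OptionsProxySketch.class.getClassLoader(),
            new Class<?>[] {Options.class},
            handler);
  }

  public static void main(String[] args) {
    System.out.println(create().iterations());
  }
}
```

This mirrors what PipelineOptionsFactory.as(...) does internally; marking the nested Options interface (and its enclosing class) public is the whole remedy for the IllegalArgumentException shown above.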
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1m 20s
70 actionable tasks: 49 executed, 21 from cache
Publishing build scan...
https://gradle.com/s/jjrgi5umfgwmy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal: beam_LoadTests_Java_ParDo_Dataflow_Batch #4
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/4/display/redirect?page=changes>