Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/10 14:04:30 UTC
Build failed in Jenkins: beam_PostCommit_Python_Verify #6787
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6787/display/redirect?page=changes>
Changes:
[huangry] Add instructions to post-commit policy web page, according to
[huangry] Update website/src/contribute/postcommits-policies-details.md
------------------------------------------
[...truncated 308.32 KB...]
namenode_1 | INFO: Root resource classes found:
namenode_1 | class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1 | Dec 10, 2018 1:18:04 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1 | INFO: Provider classes found:
namenode_1 | class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1 | class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1 | Dec 10, 2018 1:18:04 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1 | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1 | Dec 10, 2018 1:18:05 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1 | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1 | DEBUG http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1 | DEBUG Uploading 1 files using 1 thread(s).
test_1 | DEBUG Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1 | INFO Writing to '/kinglear.txt'.
test_1 | DEBUG Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1 | DEBUG http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1 | DEBUG Starting new HTTP connection (1): datanode:50075
datanode_1 | 18/12/10 13:18:05 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1 | 18/12/10 13:18:05 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1 | 18/12/10 13:18:05 INFO datanode.DataNode: Receiving BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001 src: /172.18.0.3:35098 dest: /172.18.0.3:50010
datanode_1 | 18/12/10 13:18:06 INFO DataNode.clienttrace: src: /172.18.0.3:35098, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_135923172_67, offset: 0, srvID: ad2f573f-ab75-4b5d-bfc9-20bbb58a664f, blockid: BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001, duration: 16169505
datanode_1 | 18/12/10 13:18:06 INFO datanode.DataNode: PacketResponder: BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1 | 18/12/10 13:18:06 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /kinglear.txt
namenode_1 | 18/12/10 13:18:06 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1 | 18/12/10 13:18:06 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_135923172_67
test_1 | DEBUG Upload of 'kinglear.txt' to '/kinglear.txt' complete.
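
(The 307-then-201 exchange above is WebHDFS's standard two-step CREATE: the namenode answers the initial PUT with a redirect, and the client re-sends the request with the file bytes to the datanode named in the Location header. A minimal sketch of that exchange using the requests library, not the test's actual HDFS client; the host names and path below simply mirror the log:)

    import requests

    namenode = 'http://namenode:50070'
    path = '/kinglear.txt'

    # Step 1: the namenode replies 307, naming a datanode in the
    # Location header instead of accepting the bytes itself.
    r = requests.put(
        '%s/webhdfs/v1%s' % (namenode, path),
        params={'op': 'CREATE', 'user.name': 'root', 'overwrite': 'true'},
        allow_redirects=False)
    assert r.status_code == 307
    datanode_url = r.headers['Location']

    # Step 2: PUT the actual bytes to the datanode; 201 Created on success.
    with open('kinglear.txt', 'rb') as f:
        assert requests.put(datanode_url, data=f).status_code == 201
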
test_1 | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1 | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7faf1032c500> ====================
test_1 | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7faf1032c398> ====================
test_1 | INFO:root:==================== <function lift_combiners at 0x7faf106996e0> ====================
test_1 | INFO:root:==================== <function expand_gbk at 0x7faf10699488> ====================
test_1 | INFO:root:==================== <function sink_flattens at 0x7faf10699578> ====================
test_1 | INFO:root:==================== <function greedily_fuse at 0x7faf10641410> ====================
test_1 | INFO:root:==================== <function impulse_to_input at 0x7faf106999b0> ====================
test_1 | INFO:root:==================== <function inject_timer_pcollections at 0x7faf106417d0> ====================
test_1 | INFO:root:==================== <function sort_stages at 0x7faf10641578> ====================
test_1 | INFO:root:==================== <function window_pcollection_coders at 0x7faf106415f0> ====================
test_1 | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1 | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1 | 18/12/10 13:18:07 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1 | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1 | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1 | 18/12/10 13:18:09 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1 | 18/12/10 13:18:09 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration
datanode_1 | 18/12/10 13:18:09 INFO datanode.DataNode: Receiving BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002 src: /172.18.0.3:35116 dest: /172.18.0.3:50010
datanode_1 | 18/12/10 13:18:09 INFO DataNode.clienttrace: src: /172.18.0.3:35116, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393186134_69, offset: 0, srvID: ad2f573f-ab75-4b5d-bfc9-20bbb58a664f, blockid: BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002, duration: 5879629
datanode_1 | 18/12/10 13:18:09 INFO datanode.DataNode: PacketResponder: BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1 | 18/12/10 13:18:09 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1393186134_69
test_1 | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1 | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1 | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1 | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1 | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1 | INFO:root:number of empty lines: 1663
test_1 | INFO:root:average word length: 4
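
(For orientation, the fused stages named in the run above are the canonical Beam Python wordcount. A rough sketch of an equivalent pipeline, with illustrative paths rather than the test's exact configuration; as the log noted earlier, a pipeline built without a runner option falls back to the DirectRunner:)

    import re
    import apache_beam as beam

    with beam.Pipeline() as p:  # no runner option: DirectRunner by default
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://counts'))
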
hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ...
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ...
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... done
Aborting on container exit...
real 1m17.745s
user 0m1.005s
sys 0m0.191s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6787 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_net
real 0m0.649s
user 0m0.259s
sys 0m0.086s
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 2 mins 28.165 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.
> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''
###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline
if [[ -z $PIPELINE_OPTS ]]; then
  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi
  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi
  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi
  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt
  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )
  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi
  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi
  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")
fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"
###########################################################################
# Run tests and validate that jobs finish successfully.
echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
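
(For context on how the two flags above reach the test code: nose runs only tests tagged IT because of --attr=IT, and each such test constructs its pipeline from --test-pipeline-options via TestPipeline. An illustrative sketch, not one of the suites listed below:)

    import unittest

    from nose.plugins.attrib import attr

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    class ExampleIT(unittest.TestCase):
      @attr('IT')  # selected by --attr=IT
      def test_double(self):
        # TestPipeline parses the --test-pipeline-options flag that
        # nosetests forwards from PIPELINE_OPTS.
        with TestPipeline(is_integration_test=True) as p:
          result = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
          assert_that(result, equal_to([2, 4, 6]))
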
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2753.798s
OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-16782271963442100812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_06-5215672516099076790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_33_29-8186462617952180136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_39_15-4753622910723972224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_45_07-2327722867894913170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_51_43-10077632862119004298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_58_15-13670819857058327589?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_48-1874236607803171656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-491409402266755890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_47-8603110114272425538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_31_27-6974289105382371380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_37_40-16107392902980871693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_43_33-12224031943308507181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-8989471799269995084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_43-9164888784766002425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_24_52-14940826842618409191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_32_15-1197881686568611222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-15971653544795751510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_52-11007745241646050502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_34_14-7056313100898702335?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_40_05-14487547893361462227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_45-11602252167180115616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_46-11781859920948038434?project=apache-beam-testing.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 45 mins 54.621 secs.
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 159
* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 49m 22s
6 actionable tasks: 6 executed
Publishing build scan...
https://gradle.com/s/3jpomnohtxsio
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Jenkins build is back to normal : beam_PostCommit_Python_Verify #6790
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6790/display/redirect>
---------------------------------------------------------------------
Build failed in Jenkins: beam_PostCommit_Python_Verify #6789
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6789/display/redirect?page=changes>
Changes:
[iemejia] [BEAM-6079] Add ability for CassandraIO to delete data
[iemejia] [BEAM-6079] Fix access level and clean up generics issues
------------------------------------------
[...truncated 1.14 MB...]
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
},
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "ReadFromPubSub/Map(_from_proto_str).out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s1"
},
"serialized_fn": "ref_AppliedPTransform_ReadFromPubSub/Map(_from_proto_str)_4",
"user_name": "ReadFromPubSub/Map(_from_proto_str)"
}
},
{
"kind": "ParallelDo",
"name": "s3",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "add_attribute"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
},
{
"@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "add_attribute.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s2"
},
"serialized_fn": "ref_AppliedPTransform_add_attribute_5",
"user_name": "add_attribute"
}
},
{
"kind": "ParallelDo",
"name": "s4",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "to_proto_str"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "WriteToPubSub/ToProtobuf.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s3"
},
"serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
"user_name": "WriteToPubSub/ToProtobuf"
}
},
{
"kind": "ParallelWrite",
"name": "s5",
"properties": {
"display_data": [],
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"format": "pubsub",
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s4"
},
"pubsub_id_label": "id",
"pubsub_serialized_attributes_fn": "",
"pubsub_timestamp_label": "timestamp",
"pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output349ef07d-0943-47a4-a416-9b29a8731644",
"user_name": "WriteToPubSub/Write/NativeWrite"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
createTime: u'2018-12-10T16:46:48.654574Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2018-12-10_08_46_45-4661422501458528405'
location: u'us-central1'
name: u'beamapp-jenkins-1210164636-429196'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2018-12-10T16:46:48.654574Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2018-12-10_08_46_45-4661422501458528405]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_45-4661422501458528405?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 19 tests in 1887.088s
FAILED (errors=5, failures=6)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_31-485909928086186037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-2746877262249700722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_28-16554391804803232926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-10136866855711679295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-16707170736943022074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_59-14550076557050917467?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_45_15-16396278941296888174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_18-13056604862306843484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_45-4661422501458528405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_26-15932434091359080518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_04-3692776795664872478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_45_11-1132200751986424363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_16-5927959844519459605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_28-18153654139549890044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_23-3015796963868094813?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_45-7614230511101895736?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_05-483436329258915908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-18165015813611854332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_08-12554187595833307794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_19-6828310122637119890?project=apache-beam-testing.
> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 31 mins 28.072 secs.
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 274
* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 35m 28s
6 actionable tasks: 6 executed
Publishing build scan...
https://gradle.com/s/7rv5yqqf2newe
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_PostCommit_Python_Verify #6788
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6788/display/redirect?page=changes>
Changes:
[gleb] [BEAM-5866] Override structuralValue in ListCoder
[gleb] [BEAM-5866] Override structuralValue in MapCoder
[github] Clarify usage of PipelineOptions subclass
[coheigea] Move string literals to the left hand side of the expression in a few
[coheigea] Upgrade to Apache Tika 1.19.1
[robertwb] [BEAM-4444] Parquet IO for Python SDK (#6763)
------------------------------------------
[...truncated 150.11 KB...]
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6788 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7f65f80a3428 m=0
goroutine 0 [idle]:
goroutine 1 [running, locked to thread]:
runtime.systemstack_switch()
    /usr/local/go/src/runtime/asm_amd64.s:252 fp=0xc42005fe48 sp=0xc42005fe40
runtime.newproc(0x0, 0xee95c8)
    /usr/local/go/src/runtime/proc.go:2713 +0x8b fp=0xc42005fe90 sp=0xc42005fe48
runtime.init.3()
    /usr/local/go/src/runtime/proc.go:213 +0x35 fp=0xc42005feb0 sp=0xc42005fe90
runtime.init()
    /usr/local/go/src/runtime/write_err.go:14 +0x2ee fp=0xc42005ff48 sp=0xc42005feb0
runtime.main()
    /usr/local/go/src/runtime/proc.go:141 +0xf6 fp=0xc42005ffa0 sp=0xc42005ff48
runtime.goexit()
    /usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 fp=0xc42005ffa8 sp=0xc42005ffa0
goroutine 17 [syscall, locked to thread]:
runtime.goexit()
    /usr/local/go/src/runtime/asm_amd64.s:2086 +0x1
rax 0x0
rbx 0x7f65f8433700
rcx 0x7f65f80a3428
rdx 0x6
rdi 0x668d
rsi 0x668d
rbp 0xf1d11e
rsp 0x7ffc1f294008
r8 0x7f65f8434770
r9 0x7f65f8a6d700
r10 0x8
r11 0x206
r12 0x33081a0
r13 0xf3
r14 0x30
r15 0x3
rip 0x7f65f80a3428
rflags 0x206
cs 0x33
fs 0x0
gs 0x0
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.067 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.
> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''
###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline
if [[ -z $PIPELINE_OPTS ]]; then
  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi
  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi
  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi
  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt
  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )
  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi
  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi
  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")
fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
./scripts/run_integration_test.sh: fork: retry: Resource temporarily unavailable
./scripts/run_integration_test.sh: fork: retry: No child processes
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"
###########################################################################
# Run tests and validate that jobs finish successfully.
echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
Traceback (most recent call last):
  File "setup.py", line 239, in <module>
    'test': generate_protos_first(test),
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/__init__.py>", line 143, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/commands.py>", line 158, in run
    TestProgram(argv=argv, config=self.__config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py>", line 121, in __init__
    **extra_args)
  File "/usr/lib/python2.7/unittest/main.py", line 95, in __init__
    self.runTests()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py>", line 207, in runTests
    result = self.testRunner.run(self.test)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py>", line 365, in run
    testQueue = Queue()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 667, in temp
    token, exp = self._create(typeid, *args, **kwds)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 565, in _create
    conn = self._Client(self._address, authkey=self._authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 175, in Client
    answer_challenge(c, authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 432, in answer_challenge
    message = connection.recv_bytes(256) # reject large message
EOFError
> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 3.512 secs.
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 159
* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 313
* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 274
* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
BUILD FAILED in 45s
6 actionable tasks: 6 executed
Publishing build scan...
https://gradle.com/s/wkkfpyf22gtru
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org