Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/09/09 21:44:20 UTC
[jira] [Resolved] (SPARK-17469) mapWithState causes block lock warning
[ https://issues.apache.org/jira/browse/SPARK-17469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-17469.
-------------------------------
Resolution: Duplicate
> mapWithState causes block lock warning
> --------------------------------------
>
> Key: SPARK-17469
> URL: https://issues.apache.org/jira/browse/SPARK-17469
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Affects Versions: 2.0.0
> Environment: run-example
> Reporter: Christopher Mårtensson
> Priority: Minor
>
> run-example with StatefulNetworkWordCount gives warnings like the following
> -------------------------------------------
> Time: 1473416200000 ms
> -------------------------------------------
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1788:
> [rdd_2475_0]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1791:
> [rdd_2475_3]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1790:
> [rdd_2475_2]
> 16/09/09 12:16:41 WARN Executor: 1 block locks were not released by TID = 1789:
> [rdd_2475_1]
> -------------------------------------------
> Time: 1473416201000 ms
> -------------------------------------------
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1792:
> [rdd_2481_0]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1794:
> [rdd_2481_2]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1795:
> [rdd_2481_3]
> 16/09/09 12:16:42 WARN Executor: 1 block locks were not released by TID = 1793:
> [rdd_2481_1]
> The warnings were also reproduced by running other applications that use mapWithState.
> This has only been tested in local mode.
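A minimal sketch of the kind of application that triggers the warnings, modeled on the StatefulNetworkWordCount example the report refers to (the object name and socket source here are illustrative, not from the report). Any DStream passed through mapWithState in local mode should reproduce the "block locks were not released" messages:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

object MapWithStateRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("MapWithStateRepro")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint(".")  // mapWithState requires a checkpoint directory

    // Feed text via e.g. `nc -lk 9999`, as in the bundled example.
    val lines = ssc.socketTextStream("localhost", 9999)
    val wordCounts = lines.flatMap(_.split(" ")).map((_, 1))

    // Running count per word; each completed batch logs the lock warnings.
    val mappingFunc = (word: String, one: Option[Int], state: State[Int]) => {
      val sum = one.getOrElse(0) + state.getOption.getOrElse(0)
      state.update(sum)
      (word, sum)
    }
    val stateStream = wordCounts.mapWithState(StateSpec.function(mappingFunc))
    stateStream.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```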
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org