Posted to issues@spark.apache.org by "Aniket Bhatnagar (JIRA)" <ji...@apache.org> on 2015/07/08 14:30:04 UTC
[jira] [Created] (SPARK-8895) MetricsSystem.removeSource not called in StreamingContext.stop
Aniket Bhatnagar created SPARK-8895:
---------------------------------------
Summary: MetricsSystem.removeSource not called in StreamingContext.stop
Key: SPARK-8895
URL: https://issues.apache.org/jira/browse/SPARK-8895
Project: Spark
Issue Type: Bug
Components: Streaming
Affects Versions: 1.4.0
Reporter: Aniket Bhatnagar
Priority: Minor
StreamingContext calls env.metricsSystem.registerSource during its construction but never calls env.metricsSystem.removeSource when it is stopped. Therefore, if a user attempts to restart a Streaming job in the same JVM by creating a new StreamingContext instance with the same application name, the re-registration fails with exceptions like the following in the log:
[info] o.a.s.m.MetricsSystem - Metrics already registered
java.lang.IllegalArgumentException: A metric named <>.StreamingMetrics.streaming.lastReceivedBatch_processingEndTime already exists
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:91) ~[metrics-core-3.1.0.jar:3.1.0]
at com.codahale.metrics.MetricRegistry.registerAll(MetricRegistry.java:385) ~[metrics-core-3.1.0.jar:3.1.0]
at com.codahale.metrics.MetricRegistry.register(MetricRegistry.java:85) ~[metrics-core-3.1.0.jar:3.1.0]
at org.apache.spark.metrics.MetricsSystem.registerSource(MetricsSystem.scala:148) ~[spark-core_2.11-1.4.0.jar:1.4.0]
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:199) [spark-streaming_2.11-1.4.0.jar:1.4.0]
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:71) [spark-streaming_2.11-1.4.0.jar:1.4.0]
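The failure mode can be illustrated with a self-contained stand-in (plain Java, no Spark dependency). The Registry class below is hypothetical, mirroring only the duplicate-name check that com.codahale.metrics.MetricRegistry.register performs; it is not Spark's or Metrics' actual code. The point is that a second register under the same name fails unless remove is called first, which is the call StreamingContext.stop omits:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal stand-in for a metrics registry that rejects
// duplicate names, as com.codahale.metrics.MetricRegistry does.
class Registry {
    private final Map<String, Object> metrics = new HashMap<>();

    void register(String name, Object metric) {
        if (metrics.containsKey(name)) {
            throw new IllegalArgumentException(
                "A metric named " + name + " already exists");
        }
        metrics.put(name, metric);
    }

    void remove(String name) {
        metrics.remove(name);
    }
}

class MetricsRegistryDemo {
    public static void main(String[] args) {
        Registry registry = new Registry();
        String name = "app.StreamingMetrics.streaming.lastReceivedBatch_processingEndTime";

        registry.register(name, new Object());      // first StreamingContext: ok

        try {
            registry.register(name, new Object());  // restart without removeSource: fails
        } catch (IllegalArgumentException e) {
            System.out.println("without remove: " + e.getMessage());
        }

        registry.remove(name);                      // what stop() should do first
        registry.register(name, new Object());      // restart now succeeds
        System.out.println("after remove: re-register succeeded");
    }
}
```

Under this sketch, the fix would be for StreamingContext.stop to call env.metricsSystem.removeSource with the same source it registered in its constructor, so a fresh context with the same application name can register cleanly.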
--