Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/06 11:19:12 UTC

[jira] [Resolved] (SPARK-6569) Kafka directInputStream logs what appear to be incorrect warnings

     [ https://issues.apache.org/jira/browse/SPARK-6569?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-6569.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 5366
[https://github.com/apache/spark/pull/5366]

> Kafka directInputStream logs what appear to be incorrect warnings
> -----------------------------------------------------------------
>
>                 Key: SPARK-6569
>                 URL: https://issues.apache.org/jira/browse/SPARK-6569
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.3.0
>         Environment: Spark 1.3.0
>            Reporter: Platon Potapov
>            Priority: Minor
>             Fix For: 1.4.0
>
>
> During what appears to be normal operation of streaming from a Kafka topic, the following log records are observed, logged periodically:
> {code}
> [Stage 391:==========================================>              (3 + 0) / 4]
> 2015-03-27 12:49:54 WARN KafkaRDD: Beginning offset ${part.fromOffset} is the same as ending offset skipping raw 0
> 2015-03-27 12:49:54 WARN KafkaRDD: Beginning offset ${part.fromOffset} is the same as ending offset skipping raw 0
> 2015-03-27 12:49:54 WARN KafkaRDD: Beginning offset ${part.fromOffset} is the same as ending offset skipping raw 0
> {code}
> * the part.fromOffset placeholder is not substituted with an actual value (see the sketch after the quoted description)
> * does this condition really warrant a warning being logged?
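
A minimal Scala sketch of the likely cause (an assumption inferred from the log output above, not the actual Spark source): a string literal concatenated without the `s` interpolator prefix leaves its "${...}" placeholder as literal text, while an interpolated literal substitutes its values, which matches the mixed output "${part.fromOffset} ... skipping raw 0". The Part class below is a hypothetical stand-in, not Spark's KafkaRDD offset-range type.

{code}
// Hypothetical stand-in for the per-partition offset range; not Spark's class.
case class Part(fromOffset: Long, topic: String, partition: Int)

object InterpolationDemo {
  def main(args: Array[String]): Unit = {
    val part = Part(100L, "raw", 0)

    // Missing `s` prefix on the first literal: "${part.fromOffset}" is printed
    // verbatim, while the second, interpolated literal substitutes topic and partition.
    println("Beginning offset ${part.fromOffset} is the same as ending offset " +
      s"skipping ${part.topic} ${part.partition}")

    // With the `s` prefix on both literals the offset is substituted as expected.
    println(s"Beginning offset ${part.fromOffset} is the same as ending offset " +
      s"skipping ${part.topic} ${part.partition}")
  }
}
{code}

Running the object prints the unsubstituted message first ("Beginning offset ${part.fromOffset} is the same as ending offset skipping raw 0") and the corrected one second ("Beginning offset 100 is the same as ending offset skipping raw 0"), reproducing the symptom reported above.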



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org