Posted to issues@spark.apache.org by "Platon Potapov (JIRA)" <ji...@apache.org> on 2015/04/24 13:36:52 UTC

[jira] [Created] (SPARK-7122) KafkaUtils.createDirectStream - unreasonable processing time in absence of load

Platon Potapov created SPARK-7122:
-------------------------------------

             Summary: KafkaUtils.createDirectStream - unreasonable processing time in absence of load
                 Key: SPARK-7122
                 URL: https://issues.apache.org/jira/browse/SPARK-7122
             Project: Spark
          Issue Type: Question
          Components: Streaming
    Affects Versions: 1.3.1
         Environment: Spark Streaming 1.3.1, standalone mode running on just 1 box: Ubuntu 14.04.2 LTS, 4 cores, 8GB RAM, java version "1.8.0_40"
            Reporter: Platon Potapov


Attached is the complete source code of a test Spark job. No external data generators are run - the mere presence of a Kafka topic named "raw" suffices.

The job is run with no load whatsoever. http://localhost:4040/streaming is checked to obtain the job processing duration.
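The attachment is not reproduced in this plain-text archive. For context, here is a minimal sketch of the kind of job described, assuming the direct Kafka API from spark-streaming-kafka 1.3.1; the topic name "raw" and the 5-second batch interval come from the report, while the broker address, object name, and master URL are placeholders:

{code}
// Minimal sketch, NOT the actual attachment: a no-load direct-stream test job.
// Assumptions: local broker at localhost:9092, app/object name chosen here.
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object WindowLatencyTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WindowLatencyTest")
    val ssc = new StreamingContext(conf, Seconds(5)) // 5-second batch interval

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val bytes = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("raw")) // topic "raw"; no producer needs to be running

    // dummy transformation (swap in the variant under test here)
    val temperature = bytes.filter(_._1 == "abc")
    val abc = temperature.window(Seconds(40), Seconds(5))
    abc.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}

Processing durations are then read off the http://localhost:4040/streaming UI, as described above.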

* When the job contains the following transformation:
{code}
    // dummy transformation
    val temperature = bytes.filter(_._1 == "abc")
    val abc = temperature.window(Seconds(40), Seconds(5))
    abc.print()
{code}
the median processing time is 3 seconds 80 ms.

* When the job contains the following transformation:
{code}
    // dummy transformation
    val temperature = bytes.filter(_._1 == "abc")
    val abc = temperature.map(x => (1, x))
    abc.print()
{code}
the median processing time is just 50 ms.

Why does the "window" transformation introduce such a large increase in job duration?

Note: the result is the same regardless of the number of Kafka topic partitions (I tried 1 and 8).
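One piece of context worth noting when comparing the two numbers (this is an editorial aside, not part of the original report): in Spark Streaming, each RDD emitted by window(windowDuration, slideDuration) is a union over windowDuration / slideDuration parent batch RDDs, so every 5-second batch in the windowed variant touches several batches' worth of Kafka RDDs rather than one. With the values from the snippets above:

{code}
// Back-of-envelope illustration, using the values from the snippets above:
// batch interval = slide = 5 s, window = 40 s.
val windowSeconds = 40
val slideSeconds = 5
val parentBatches = windowSeconds / slideSeconds
println(s"each windowed batch unions $parentBatches parent RDDs")
{code}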




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
