Posted to commits@spark.apache.org by td...@apache.org on 2015/12/23 00:33:39 UTC

spark git commit: [SPARK-12487][STREAMING][DOCUMENT] Add docs for Kafka message handler

Repository: spark
Updated Branches:
  refs/heads/master b374a2583 -> 93db50d1c


[SPARK-12487][STREAMING][DOCUMENT] Add docs for Kafka message handler

Author: Shixiong Zhu <sh...@databricks.com>

Closes #10439 from zsxwing/kafka-message-handler-doc.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/93db50d1
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/93db50d1
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/93db50d1

Branch: refs/heads/master
Commit: 93db50d1c2ff97e6eb9200a995e4601f752968ae
Parents: b374a25
Author: Shixiong Zhu <sh...@databricks.com>
Authored: Tue Dec 22 15:33:30 2015 -0800
Committer: Tathagata Das <ta...@gmail.com>
Committed: Tue Dec 22 15:33:30 2015 -0800

----------------------------------------------------------------------
 docs/streaming-kafka-integration.md | 3 +++
 1 file changed, 3 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/93db50d1/docs/streaming-kafka-integration.md
----------------------------------------------------------------------
diff --git a/docs/streaming-kafka-integration.md b/docs/streaming-kafka-integration.md
index 5be73c4..9454714 100644
--- a/docs/streaming-kafka-integration.md
+++ b/docs/streaming-kafka-integration.md
@@ -104,6 +104,7 @@ Next, we discuss how to use this approach in your streaming application.
 			[key class], [value class], [key decoder class], [value decoder class] ](
 			streamingContext, [map of Kafka parameters], [set of topics to consume])
 
+	You can also pass a `messageHandler` to `createDirectStream` to access the `MessageAndMetadata` that contains metadata about the current message, and to transform each record into any desired type.
 	See the [API docs](api/scala/index.html#org.apache.spark.streaming.kafka.KafkaUtils$)
 	and the [example]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/DirectKafkaWordCount.scala).
 	</div>
@@ -115,6 +116,7 @@ Next, we discuss how to use this approach in your streaming application.
 				[key class], [value class], [key decoder class], [value decoder class],
 				[map of Kafka parameters], [set of topics to consume]);
 
+	You can also pass a `messageHandler` to `createDirectStream` to access the `MessageAndMetadata` that contains metadata about the current message, and to transform each record into any desired type.
 	See the [API docs](api/java/index.html?org/apache/spark/streaming/kafka/KafkaUtils.html)
 	and the [example]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaDirectKafkaWordCount.java).
 
@@ -123,6 +125,7 @@ Next, we discuss how to use this approach in your streaming application.
 		from pyspark.streaming.kafka import KafkaUtils
 		directKafkaStream = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": brokers})
 
+	You can also pass a `messageHandler` to `createDirectStream` to access the `KafkaMessageAndMetadata` that contains metadata about the current message, and to transform each record into any desired type.
 	By default, the Python API will decode Kafka data as UTF8-encoded strings. You can specify a custom decoding function to decode the byte arrays in Kafka records into an arbitrary data type. See the [API docs](api/python/pyspark.streaming.html#pyspark.streaming.kafka.KafkaUtils)
 	and the [example]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/python/streaming/direct_kafka_wordcount.py).
 	</div>
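
For reference, a minimal Scala sketch of what the added sentence describes, assuming a `streamingContext`, a `kafkaParams` map, and a topic named "myTopic" are already in scope (these names are illustrative, not part of the commit). Note that the `createDirectStream` overload that takes a `messageHandler` also requires explicit starting offsets:

	import kafka.common.TopicAndPartition
	import kafka.message.MessageAndMetadata
	import kafka.serializer.StringDecoder
	import org.apache.spark.streaming.kafka.KafkaUtils

	// Illustrative starting point: offset 0 of partition 0; in practice,
	// use offsets saved from a previous run.
	val fromOffsets = Map(TopicAndPartition("myTopic", 0) -> 0L)

	// The messageHandler maps each MessageAndMetadata to whatever type the
	// stream should carry; here we keep Kafka metadata next to the payload.
	val messageHandler = (mmd: MessageAndMetadata[String, String]) =>
	  (mmd.topic, mmd.partition, mmd.offset, mmd.message())

	val stream = KafkaUtils.createDirectStream[
	    String, String, StringDecoder, StringDecoder, (String, Int, Long, String)](
	  streamingContext, kafkaParams, fromOffsets, messageHandler)

Each record in `stream` is then the tuple produced by the handler, so topic, partition, and offset travel with the data rather than being recovered later from the RDD's offset ranges.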

