Posted to user@spark.apache.org by pradeepbill <pr...@gmail.com> on 2017/06/30 13:47:07 UTC

spark streaming socket read issue

hi there, I have a Spark Streaming issue that I am not able to figure out.
The code below reads from a socket, but I don't see any input going into the
job. I have nc -l 1111 running and am dumping data into it, but my Spark job
is not able to read data from 10.176.110.112:1111. Please advise.

Dataset<Row> d = sparkSession.readStream().format("socket")
        .option("host", "10.176.110.112")
        .option("port", 1111)
        .load();


thanks
Pradeep




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-socket-read-issue-tp28813.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: spark streaming socket read issue

Posted by "Shixiong(Ryan) Zhu" <sh...@databricks.com>.
Could you show the code that starts the StreamingQuery from the Dataset? If
you don't call `writeStream...start()`, it won't run anything.
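For reference, a minimal end-to-end sketch of what the complete job might look like (assuming Spark 2.x Structured Streaming, local mode, and a console sink for debugging; the host and port are taken from the original post):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class SocketStreamExample {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("SocketStreamExample")
                .master("local[*]")   // local mode for testing
                .getOrCreate();

        // Building the source defines the stream; nothing is read yet.
        Dataset<Row> lines = spark.readStream()
                .format("socket")
                .option("host", "10.176.110.112")
                .option("port", 1111)
                .load();

        // The query only begins pulling data once start() is called.
        StreamingQuery query = lines.writeStream()
                .format("console")    // print each micro-batch to stdout
                .outputMode("append")
                .start();

        query.awaitTermination();
    }
}
```

Note also that `nc -l 1111` may exit after the first connection closes; the examples in the Spark docs use `nc -lk 9999` so the listener stays up across connections.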
