Posted to dev@spark.apache.org by Anton Okolnychyi <an...@gmail.com> on 2017/11/01 09:41:08 UTC

[SS] Custom Sinks

Hi all,

I have a question about the future of custom data sinks in Structured
Streaming. In particular, I want to know how continuous processing and the
Datasource API V2 will impact them.

Right now, it is possible to build custom data sinks against the current
Datasource API (V1) by implementing the StreamSinkProvider and Sink traits. I
am wondering whether this approach will remain the recommended way, or
whether it is better to wait for the Datasource API V2 and the final
implementation of continuous processing.
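For context, here is a minimal sketch of the V1 approach I mean, against the Spark 2.x APIs (org.apache.spark.sql.execution.streaming.Sink and org.apache.spark.sql.sources.StreamSinkProvider). The class names and the row-count logging are made up for illustration; only the trait signatures come from Spark:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}
import org.apache.spark.sql.execution.streaming.Sink
import org.apache.spark.sql.sources.StreamSinkProvider
import org.apache.spark.sql.streaming.OutputMode

// Hypothetical sink that just logs the size of each micro-batch.
class CountingSink extends Sink {
  override def addBatch(batchId: Long, data: DataFrame): Unit = {
    // A sink should be idempotent with respect to batchId: the same
    // batch may be re-delivered after a failure and recovery.
    println(s"batch $batchId: ${data.count()} rows")
  }
}

// Provider that Structured Streaming instantiates for this sink.
class CountingSinkProvider extends StreamSinkProvider {
  override def createSink(
      sqlContext: SQLContext,
      parameters: Map[String, String],
      partitionColumns: Seq[String],
      outputMode: OutputMode): Sink = new CountingSink
}
```

The sink would then be selected by passing the provider's fully qualified class name to `df.writeStream.format(...)`. Note that Sink lives in an `execution` package, which is part of why I am unsure how stable this contract is meant to be.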

As far as I understand SPARK-20928, there will be changes in the Source API
in SS. But what about the Sink API? Is it safe to implement it now?

Best regards,
Anton

Re: [SS] Custom Sinks

Posted by Reynold Xin <rx...@databricks.com>.
They will probably both change, but I wouldn't block on the change if you
have an immediate need.


On Wed, Nov 1, 2017 at 10:41 AM, Anton Okolnychyi <
anton.okolnychyi@gmail.com> wrote:

> [quoted text of the original message, as above]