Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2016/04/18 19:23:30 UTC

Implicit from scala.concurrent.duration.Duration to ProcessingTime?

Hi,

While working with structured streaming (aka SparkSQL Streams :)) I
thought about adding

implicit def toProcessingTime(duration: Duration) = ProcessingTime(duration)

What do you think?

I think it'd improve the API:

.trigger(ProcessingTime(10 seconds))

vs

.trigger(10 seconds)

(since it's not a released feature yet, I didn't file an issue in
JIRA - please guide me if needed).
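
For reference, a self-contained sketch of the proposed conversion. ProcessingTime
here is a stand-in for org.apache.spark.sql.streaming.ProcessingTime so the
snippet compiles without Spark on the classpath; DataStreamWriter.trigger is
likewise mocked:

```scala
import scala.concurrent.duration._

// Stand-in for org.apache.spark.sql.streaming.ProcessingTime, so the
// sketch compiles without Spark on the classpath.
case class ProcessingTime(intervalMs: Long)
object ProcessingTime {
  def apply(interval: Duration): ProcessingTime =
    new ProcessingTime(interval.toMillis)
}

object TriggerImplicits {
  // The proposed conversion: lets a Duration be passed wherever a
  // ProcessingTime is expected. Note the explicit return type, which
  // implicit defs should always declare.
  implicit def durationToProcessingTime(duration: Duration): ProcessingTime =
    ProcessingTime(duration)
}

object Demo extends App {
  import TriggerImplicits._
  // Stand-in for DataStreamWriter.trigger
  def trigger(t: ProcessingTime): Unit = println(t)
  trigger(10.seconds)  // the implicit kicks in: ProcessingTime(10000)
}
```

With the implicit in scope, `.trigger(10.seconds)` type-checks because the
compiler inserts the Duration => ProcessingTime view at the call site.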

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Implicit from scala.concurrent.duration.Duration to ProcessingTime?

Posted by Reynold Xin <rx...@databricks.com>.
Nope. It is unclear whether they would be useful enough or not. But when
designing APIs we always need to anticipate future changes.

On Monday, April 18, 2016, Jacek Laskowski <ja...@japila.pl> wrote:

> When you say "in the future", do you have any specific timeframe in
> mind? You got me curious :)
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Mon, Apr 18, 2016 at 7:44 PM, Reynold Xin <rxin@databricks.com> wrote:
> > The problem with this is that we might introduce event time based
> > trigger in the future, and then it would be more confusing...
> >
> >
> > On Monday, April 18, 2016, Jacek Laskowski <jacek@japila.pl> wrote:
> >>
> >> Hi,
> >>
> >> While working with structured streaming (aka SparkSQL Streams :)) I
> >> thought about adding
> >>
> >> implicit def toProcessingTime(duration: Duration) =
> >> ProcessingTime(duration)
> >>
> >> What do you think?
> >>
> >> I think it'd improve the API:
> >>
> >> .trigger(ProcessingTime(10 seconds))
> >>
> >> vs
> >>
> >> .trigger(10 seconds)
> >>
> >> (since it's not a release feature I didn't mean to file an issue in
> >> JIRA - please guide if needed).
> >>
> >> Pozdrawiam,
> >> Jacek Laskowski
> >> ----
> >> https://medium.com/@jaceklaskowski/
> >> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> >> Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >
>

Re: Implicit from scala.concurrent.duration.Duration to ProcessingTime?

Posted by Jacek Laskowski <ja...@japila.pl>.
When you say "in the future", do you have any specific timeframe in
mind? You got me curious :)

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Mon, Apr 18, 2016 at 7:44 PM, Reynold Xin <rx...@databricks.com> wrote:
> The problem with this is that we might introduce event time based trigger in
> the future, and then it would be more confusing...
>
>
> On Monday, April 18, 2016, Jacek Laskowski <ja...@japila.pl> wrote:
>>
>> Hi,
>>
>> While working with structured streaming (aka SparkSQL Streams :)) I
>> thought about adding
>>
>> implicit def toProcessingTime(duration: Duration) =
>> ProcessingTime(duration)
>>
>> What do you think?
>>
>> I think it'd improve the API:
>>
>> .trigger(ProcessingTime(10 seconds))
>>
>> vs
>>
>> .trigger(10 seconds)
>>
>> (since it's not a release feature I didn't mean to file an issue in
>> JIRA - please guide if needed).
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>



Re: Implicit from scala.concurrent.duration.Duration to ProcessingTime?

Posted by Reynold Xin <rx...@databricks.com>.
The problem with this is that we might introduce an event-time-based
trigger in the future, and then it would be more confusing...
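
To make the concern concrete, here is a hypothetical sketch. EventTime is NOT
a real Spark type; it is invented here purely to illustrate the ambiguity:
once more than one trigger type is parameterized by a duration, a bare
Duration argument no longer has a single obvious meaning, and the implicit
would silently commit to one interpretation.

```scala
import scala.concurrent.duration._

// Hypothetical trigger hierarchy. Only ProcessingTime exists today;
// EventTime sketches a possible future duration-based trigger.
sealed trait Trigger
case class ProcessingTime(intervalMs: Long) extends Trigger
case class EventTime(intervalMs: Long) extends Trigger

object AmbiguityDemo {
  // If trigger accepted any Trigger, what should 10.seconds mean?
  // This implicit silently makes it always mean processing time,
  // even when the caller intended an event-time trigger.
  implicit def durationToProcessingTime(d: Duration): ProcessingTime =
    ProcessingTime(d.toMillis)

  def trigger(t: Trigger): Trigger = t
  // trigger(10.seconds) compiles, but only because the compiler
  // picked ProcessingTime on the caller's behalf.
}
```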

On Monday, April 18, 2016, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> While working with structured streaming (aka SparkSQL Streams :)) I
> thought about adding
>
> implicit def toProcessingTime(duration: Duration) =
> ProcessingTime(duration)
>
> What do you think?
>
> I think it'd improve the API:
>
> .trigger(ProcessingTime(10 seconds))
>
> vs
>
> .trigger(10 seconds)
>
> (since it's not a release feature I didn't mean to file an issue in
> JIRA - please guide if needed).
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
>