Posted to dev@spark.apache.org by "Yan Zhou.sc" <Ya...@huawei.com> on 2014/06/06 20:26:56 UTC

Optiq for SparkSQL?

Can anybody share your thoughts, comments, or interest regarding the applicability of the Optiq framework to Spark, and to SparkSQL in particular?

Thanks,

Re: Optiq for SparkSQL?

Posted by Christopher Nguyen <ct...@adatao.com>.
Yan, it looks like Julian did anticipate exactly this possibility:

https://github.com/julianhyde/optiq/tree/master/spark

Optiq is a cool project vision in terms of hiding various engines behind
one consistent API.
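
For a sense of what that consistent API looks like in practice, here is a
minimal sketch of querying a backing engine through Optiq's JDBC facade.
The driver class and "jdbc:optiq:" URL follow Optiq's conventions at the
time; model.json and the "emps" table are hypothetical placeholders for
whatever schema the model maps in:

    import java.sql.DriverManager

    // Load Optiq's JDBC driver and open a connection against a model file
    // describing the backing schema/engine (hypothetical model.json).
    Class.forName("net.hydromatic.optiq.jdbc.Driver")
    val connection = DriverManager.getConnection("jdbc:optiq:model=model.json")

    // Plain SQL over whatever the model exposes; "emps" is a hypothetical table.
    val statement = connection.createStatement()
    val results = statement.executeQuery("SELECT name FROM emps WHERE age > 30")
    while (results.next()) {
      println(results.getString("name"))
    }
    connection.close()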

That said, from just the Spark perspective, I don't see a huge value add in
layering Optiq above SparkSQL, at least until Optiq provides a lot more
idioms and/or operational facilities than just making Spark RDDs look like
tables, which SparkSQL already does quite nicely and keeps improving on.
Warehousing, perhaps?
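
For reference, this is roughly what "making RDDs look like tables" already
looks like in SparkSQL; a minimal sketch against the Spark 1.0-era API,
assuming sc is an existing SparkContext and "people.txt" is a hypothetical
comma-separated file of name,age pairs:

    import org.apache.spark.sql.SQLContext

    case class Person(name: String, age: Int)

    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion

    // Parse a text file into an RDD of case classes and register it as a table.
    val people = sc.textFile("people.txt")
      .map(_.split(","))
      .map(parts => Person(parts(0), parts(1).trim.toInt))
    people.registerAsTable("people")

    // Query it with SQL; the result is again an RDD (a SchemaRDD of Rows).
    val teenagers = sqlContext.sql("SELECT name FROM people WHERE age BETWEEN 13 AND 19")
    teenagers.map(row => "Name: " + row(0)).collect().foreach(println)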

Here, I can't avoid mentioning DDF, which aims to add more algorithmic and
data manipulation value on top of the table abstraction (
https://spark-summit.org/2014/talk/distributed-dataframe-ddf-on-apache-spark-simplifying-big-data-for-the-rest-of-us
)
--
Christopher T. Nguyen
Co-founder & CEO, Adatao <http://adatao.com>
linkedin.com/in/ctnguyen



On Fri, Jun 6, 2014 at 11:26 AM, Yan Zhou.sc <Ya...@huawei.com> wrote:

> Can anybody share your thoughts, comments, or interest regarding the
> applicability of the Optiq framework to Spark, and to SparkSQL in particular?
>
> Thanks,
>