Posted to user@spark.apache.org by AnilKumar B <ak...@gmail.com> on 2016/10/25 21:35:39 UTC

Operator push down through JDBC driver

Hi,

I am using Spark SQL to transform data. My source is Oracle. In general, I
extract multiple tables, join them, and then do some other transformations
in Spark.

Is there any possibility of pushing the join operator down to Oracle
through Spark SQL, instead of fetching the tables and joining them in
Spark? I am unable to find any options for such optimization rules at
https://spark.apache.org/docs/1.6.0/sql-programming-guide.html#jdbc-to-other-databases.

I am currently using Spark 1.6.
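
For reference, a minimal sketch of the fetch-then-join pattern I described
above (the JDBC URL, credentials, and the EMP/DEPT tables and columns are
placeholders, not my real schema):

    import org.apache.spark.sql.{DataFrame, SQLContext}

    // sqlContext: an existing SQLContext (e.g. from spark-shell).
    // URL, driver, credentials, and table names are placeholders.
    def oracleTable(sqlContext: SQLContext, table: String): DataFrame =
      sqlContext.read
        .format("jdbc")
        .options(Map(
          "url"      -> "jdbc:oracle:thin:@//dbhost:1521/ORCL",
          "driver"   -> "oracle.jdbc.OracleDriver",
          "user"     -> "scott",
          "password" -> "tiger",
          "dbtable"  -> table))
        .load()

    val emp  = oracleTable(sqlContext, "EMP")   // fetched over JDBC
    val dept = oracleTable(sqlContext, "DEPT")  // fetched over JDBC

    // The join itself runs inside Spark, not in Oracle.
    val joined = emp.join(dept, emp("DEPT_ID") === dept("DEPT_ID"))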

Thanks & Regards,
B Anil Kumar.

Re: Operator push down through JDBC driver

Posted by AnilKumar B <ak...@gmail.com>.
I thought we could use the sqlContext.sql("some join query") API with
JDBC; that is why I asked the above question.

But since we can only use
sqlContext.read().format("jdbc").options(options).load(), we can pass the
actual Oracle join query there (via the dbtable option).
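
For example, a minimal sketch of that approach (URL, credentials, and the
EMP/DEPT names are again placeholders; anything valid in a FROM clause can
go into dbtable):

    // Wrapping the join in "(...) t" makes Oracle execute it; Spark
    // just reads the result set of the subquery over JDBC.
    val options = Map(
      "url"      -> "jdbc:oracle:thin:@//dbhost:1521/ORCL",
      "driver"   -> "oracle.jdbc.OracleDriver",
      "user"     -> "scott",
      "password" -> "tiger",
      "dbtable"  -> ("(SELECT e.EMP_ID, e.NAME, d.DEPT_NAME " +
                     "FROM EMP e JOIN DEPT d " +
                     "ON e.DEPT_ID = d.DEPT_ID) t"))

    val joined = sqlContext.read.format("jdbc").options(options).load()

With this, Oracle executes the join and Spark only reads the joined rows
over JDBC.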

So this question is not valid. Please ignore it.

Thanks & Regards,
B Anil Kumar.

On Tue, Oct 25, 2016 at 2:35 PM, AnilKumar B <ak...@gmail.com> wrote:

> Hi,
>
> I am using Spark SQL to transform data. My source is Oracle. In general,
> I extract multiple tables, join them, and then do some other
> transformations in Spark.
>
> Is there any possibility of pushing the join operator down to Oracle
> through Spark SQL, instead of fetching the tables and joining them in
> Spark? I am unable to find any options for such optimization rules at
> https://spark.apache.org/docs/1.6.0/sql-programming-guide.html#jdbc-to-other-databases.
>
> I am currently using Spark 1.6.
>
> Thanks & Regards,
> B Anil Kumar.
>