Posted to user@spark.apache.org by Artemis User <ar...@dtechspace.com> on 2022/01/06 03:25:09 UTC

JDBCConnectionProvider in Spark

Could someone provide some insight/examples on the usage of this API? 
https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html

Why is it needed since this is an abstract class and there isn't any 
concrete implementation of it?   Thanks a lot in advance.

-- ND

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: JDBCConnectionProvider in Spark

Posted by Gabor Somogyi <ga...@gmail.com>.
We expected that it would be hard to understand all the aspects at first,
so we created an explanation for it.
Please see the following README, which hopefully answers most of your
questions:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/jdbc/README.md
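
For quick orientation, the README boils down to two steps: register an
implementation through Java's standard ServiceLoader mechanism, then use the
ordinary DataFrame reader/writer, which picks the matching provider when it
opens connections. A rough sketch under those assumptions follows; the class
name, provider name, URL, table, keytab path and principal are placeholders
(not anything shipped with Spark), and the keytab/principal/connectionProvider
options are the ones documented for the JDBC data source in recent Spark
versions:

// 1) Register the implementation for ServiceLoader by shipping a resource file
//    META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider
//    whose content is the implementing class, e.g. com.example.MyConnectionProvider

// 2) Read as usual; the provider whose canHandle() accepts the driver/options
//    opens the connection underneath (spark is an existing SparkSession).
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host:5432/mydb")    // placeholder URL
  .option("dbtable", "my_table")                           // placeholder table
  .option("keytab", "/path/to/app.keytab")                 // placeholder keytab (secure setups)
  .option("principal", "app@EXAMPLE.COM")                  // placeholder principal
  .option("connectionProvider", "my-provider")             // optionally pin a provider by name
  .load()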


On Thu, Jan 6, 2022 at 3:31 PM Sean Owen <sr...@gmail.com> wrote:

> They're in core/,
> under org.apache.spark.sql.execution.datasources.jdbc.connection.
> I don't quite understand; it's an abstraction over lots of concrete
> implementations, just simple software design here.
> You can implement your own provider too, I suppose.
>
> On Thu, Jan 6, 2022 at 8:22 AM Artemis User <ar...@dtechspace.com>
> wrote:
>
>> The only example I saw in the Spark distribution was the
>> ExampleJdbcConnectionProvider file in the examples directory.  It basically
>> just wraps the abstract class with overriding methods.  I guess my question
>> was: since Spark embeds the JDBC APIs in the DataFrame reader and writer,
>> why is such a provider API still needed?  Are there any use cases for using
>> the provider API instead of the DataFrame reader/writer when dealing with
>> JDBC?  Thanks!
>>
>> On 1/6/22 9:09 AM, Sean Owen wrote:
>>
>> There are 8 concrete implementations of it? OracleConnectionProvider, etc
>>
>> On Wed, Jan 5, 2022 at 9:26 PM Artemis User <ar...@dtechspace.com>
>> wrote:
>>
>>> Could someone provide some insight/examples on the usage of this API?
>>>
>>> https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>>>
>>> Why is it needed since this is an abstract class and there isn't any
>>> concrete implementation of it?   Thanks a lot in advance.
>>>
>>> -- ND
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>>>
>>>
>>

Re: JDBCConnectionProvider in Spark

Posted by Sean Owen <sr...@gmail.com>.
They're in core/,
under org.apache.spark.sql.execution.datasources.jdbc.connection.
I don't quite understand; it's an abstraction over lots of concrete
implementations, just simple software design here.
You can implement your own provider too, I suppose.
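
For what it's worth, a minimal sketch of what a custom provider might look
like, with the method signatures taken from the Scaladoc linked in this
thread; the class name, provider name and connection logic are made up for
illustration, and a real provider would usually do security setup (e.g. a
Kerberos login) before connecting:

import java.sql.{Connection, Driver}

import org.apache.spark.sql.jdbc.JdbcConnectionProvider

class MyConnectionProvider extends JdbcConnectionProvider {

  // Name under which this provider can be referenced.
  override val name: String = "my-provider"

  // Claim only the connections this provider knows how to handle
  // (the option key checked here is illustrative).
  override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
    options.get("connectionProvider").contains(name)

  // Open the connection; this sketch simply delegates to the JDBC driver.
  override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
    val props = new java.util.Properties()
    options.foreach { case (k, v) => props.setProperty(k, v) }
    driver.connect(options("url"), props)
  }

  // Whether connecting mutates JVM-global security state; Spark uses this
  // to decide how to synchronize provider calls.
  override def modifiesSecurityContext(driver: Driver, options: Map[String, String]): Boolean =
    false
}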

On Thu, Jan 6, 2022 at 8:22 AM Artemis User <ar...@dtechspace.com> wrote:

> The only example I saw in the Spark distribution was the
> ExampleJdbcConnectionProvider file in the examples directory.  It basically
> just wraps the abstract class with overriding methods.  I guess my question
> was: since Spark embeds the JDBC APIs in the DataFrame reader and writer,
> why is such a provider API still needed?  Are there any use cases for using
> the provider API instead of the DataFrame reader/writer when dealing with
> JDBC?  Thanks!
>
> On 1/6/22 9:09 AM, Sean Owen wrote:
>
> There are 8 concrete implementations of it? OracleConnectionProvider, etc
>
> On Wed, Jan 5, 2022 at 9:26 PM Artemis User <ar...@dtechspace.com>
> wrote:
>
>> Could someone provide some insight/examples on the usage of this API?
>>
>> https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>>
>> Why is it needed since this is an abstract class and there isn't any
>> concrete implementation of it?   Thanks a lot in advance.
>>
>> -- ND
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>>
>>
>

Re: JDBCConnectionProvider in Spark

Posted by Artemis User <ar...@dtechspace.com>.
The only example I saw in the Spark distribution was the
ExampleJdbcConnectionProvider file in the examples directory.  It
basically just wraps the abstract class with overriding methods.  I
guess my question was: since Spark embeds the JDBC APIs in the DataFrame
reader and writer, why is such a provider API still needed?  Are there
any use cases for using the provider API instead of the DataFrame
reader/writer when dealing with JDBC?  Thanks!
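
Concretely, the plain reader/writer path being compared against is roughly
the sketch below (spark is an existing SparkSession; URL, table names and
credentials are placeholders):

// Ordinary DataFrame JDBC read: no provider is mentioned anywhere,
// yet one is chosen internally when the connection is opened.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host:5432/mydb")    // placeholder URL
  .option("dbtable", "my_table")                           // placeholder table
  .option("user", "app_user")                              // placeholder credentials
  .option("password", "secret")
  .load()

// Ordinary DataFrame JDBC write over the same connection machinery.
df.write
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host:5432/mydb")
  .option("dbtable", "my_table_copy")                      // placeholder target table
  .option("user", "app_user")
  .option("password", "secret")
  .mode("append")
  .save()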

On 1/6/22 9:09 AM, Sean Owen wrote:
> There are 8 concrete implementations of it? OracleConnectionProvider, etc
>
> On Wed, Jan 5, 2022 at 9:26 PM Artemis User <ar...@dtechspace.com> 
> wrote:
>
>     Could someone provide some insight/examples on the usage of this API?
>     https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>
>     Why is it needed since this is an abstract class and there isn't any
>     concrete implementation of it?   Thanks a lot in advance.
>
>     -- ND
>
>     ---------------------------------------------------------------------
>     To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>

Re: JDBCConnectionProvider in Spark

Posted by Sean Owen <sr...@gmail.com>.
There are 8 concrete implementations of it? OracleConnectionProvider, etc

On Wed, Jan 5, 2022 at 9:26 PM Artemis User <ar...@dtechspace.com> wrote:

> Could someone provide some insight/examples on the usage of this API?
>
> https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>
> Why is it needed since this is an abstract class and there isn't any
> concrete implementation of it?   Thanks a lot in advance.
>
> -- ND
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>