Posted to user@spark.apache.org by Asim Jalis <as...@gmail.com> on 2014/12/05 18:32:24 UTC

Spark Streaming Reusing JDBC Connections

Is there a way I can keep a JDBC connection open throughout a streaming job? I
have a foreach that runs once per batch. However, I don’t want to
open a new connection for each batch; I’d rather have a persistent
connection that I can reuse. How can I do this?

Thanks.

Asim

RE: Spark Streaming Reusing JDBC Connections

Posted by Ashic Mahtab <as...@live.com>.
I've done this:

1. foreachPartition.
2. Open the connection.
3. foreach over the records inside the partition.
4. Close the connection.

Slightly crufty, but it works. Would love to see a better approach.
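The steps above can be sketched roughly as follows. This is a hedged sketch, not code from the thread: the DStream name `stream`, the JDBC URL, the table, and the use of a string payload are all placeholder assumptions.

```scala
import java.sql.DriverManager

// Sketch of the per-partition connection pattern described above.
// Assumes a DStream of records named `stream`; the URL and the
// INSERT statement are hypothetical placeholders.
stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // Step 2: open one connection per partition, not per record.
    val conn = DriverManager.getConnection("jdbc:postgresql://host/db")
    val stmt = conn.prepareStatement("INSERT INTO events (payload) VALUES (?)")
    try {
      // Step 3: foreach over the records inside the partition.
      records.foreach { record =>
        stmt.setString(1, record.toString)
        stmt.executeUpdate()
      }
    } finally {
      // Step 4: close when the partition is done, even on failure.
      stmt.close()
      conn.close()
    }
  }
}
```

A common refinement is to replace DriverManager with a lazily initialized, per-executor connection pool (e.g. a singleton object holding the pool), so connections are reused across batches rather than reopened for every partition.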

Regards,
Ashic.

Date: Fri, 5 Dec 2014 12:32:24 -0500
Subject: Spark Streaming Reusing JDBC Connections
From: asimjalis@gmail.com
To: user@spark.apache.org

Is there a way I can have a JDBC connection open through a streaming job. I have a foreach which is running once per batch. However, I don’t want to open the connection for each batch but would rather have a persistent connection that I can reuse. How can I do this?

Thanks.
Asim