Posted to users@zeppelin.apache.org by Benjamin Kim <bb...@gmail.com> on 2017/01/05 17:57:55 UTC

PostgreSQL JDBC Connections

We are getting “out of shared memory” errors when multiple users run SQL queries against our PostgreSQL DB, either simultaneously or over the course of the day. When this happens, Zeppelin 0.6.0 becomes unresponsive to any further SQL queries. It looks like this is caused by too many locks being taken and not released, transactions never closing, and/or connections never closing. Has anyone encountered such an issue with Zeppelin 0.6.0? If so, is there a solution for it?
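
For what it’s worth, the buildup can be watched directly on the database side with the standard catalog views (plain SQL, run from psql or any client):

-- connections by state; sessions stuck "idle in transaction" keep locks open
select state, count(*) from pg_stat_activity group by state;

-- locks currently held or awaited; the "out of shared memory" error usually
-- means the shared lock table is full (the hint points at max_locks_per_transaction)
select granted, count(*) from pg_locks group by granted;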

Thanks,
Ben

Re: PostgreSQL JDBC Connections

Posted by t p <ta...@gmail.com>.
YMMV and I don’t think my approach will work for your use case.

Here is a suggestion based on what I’ve done. In the first paragraph, you can register a table with code like the following.

%spark
// load the Postgres table through the Spark JDBC data source
val example = sqlContext.read.format("jdbc").options(
    Map("url" -> "jdbc:postgresql://localhost:5432/db_name",
        "driver" -> "org.postgresql.Driver",
        "user" -> "user_name",
        "password" -> "password",
        "dbtable" -> "example")).load()

// register it so %spark.sql paragraphs can query it
example.createOrReplaceTempView("example")


Subsequently you can query it with %spark.sql, limited to the SQL syntax/semantics supported by Spark. If you rely on Postgres-specific features or some advanced SQL functions/syntax, this approach may not work for you.

%spark.sql
select * from example

The rest (reports, visuals, exports to CSV) will look identical.
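
And if a paragraph stays on the Scala side, z.show should give the same table rendering (and CSV download), e.g.:

%spark
// same table renderer and CSV download as a %spark.sql paragraph;
// z is Zeppelin's built-in ZeppelinContext
z.show(example)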



> On Jan 5, 2017, at 5:30 PM, Benjamin Kim <bb...@gmail.com> wrote:
> 
> We are using the JDBC interpreter. The business analysts only know SQL and run ad-hoc queries for their report exports to CSV.
> 
> Cheers,
> Ben
> 
> 
>> On Jan 5, 2017, at 2:21 PM, t p <ta...@gmail.com> wrote:
>> 
>> Are you using JDBC or the PSQL interpreter? I had encountered something similar while using the PSQL interpreter and I had to restart Zeppelin. 
>> 
>> My experience using PSQL (PostgreSQL, HAWQ) was not as good as using Spark/Scala wrappers (the JDBC data source) to connect via JDBC and then register temp tables. This approach allowed me to work with dynamic forms in a more meaningful way, e.g. using SQL results to create a new drop-down that drives the next page, etc.
>> 
>> 
>> 
>>> On Jan 5, 2017, at 12:57 PM, Benjamin Kim <bb...@gmail.com> wrote:
>>> 
>>> We are getting “out of shared memory” errors when multiple users run SQL queries against our PostgreSQL DB, either simultaneously or over the course of the day. When this happens, Zeppelin 0.6.0 becomes unresponsive to any further SQL queries. It looks like this is caused by too many locks being taken and not released, transactions never closing, and/or connections never closing. Has anyone encountered such an issue with Zeppelin 0.6.0? If so, is there a solution for it?
>>> 
>>> Thanks,
>>> Ben
>> 
> 


Re: PostgreSQL JDBC Connections

Posted by Benjamin Kim <bb...@gmail.com>.
We are using the JDBC interpreter. The business analysts only know SQL and run ad-hoc queries for their report exports to CSV.

Cheers,
Ben


> On Jan 5, 2017, at 2:21 PM, t p <ta...@gmail.com> wrote:
> 
> Are you using JDBC or the PSQL interpreter? I had encountered something similar while using the PSQL interpreter and I had to restart Zeppelin. 
> 
> My experience using PSQL (PostgreSQL, HAWQ) was not as good as using Spark/Scala wrappers (the JDBC data source) to connect via JDBC and then register temp tables. This approach allowed me to work with dynamic forms in a more meaningful way, e.g. using SQL results to create a new drop-down that drives the next page, etc.
> 
> 
> 
>> On Jan 5, 2017, at 12:57 PM, Benjamin Kim <bb...@gmail.com> wrote:
>> 
>> We are getting “out of shared memory” errors when multiple users run SQL queries against our PostgreSQL DB, either simultaneously or over the course of the day. When this happens, Zeppelin 0.6.0 becomes unresponsive to any further SQL queries. It looks like this is caused by too many locks being taken and not released, transactions never closing, and/or connections never closing. Has anyone encountered such an issue with Zeppelin 0.6.0? If so, is there a solution for it?
>> 
>> Thanks,
>> Ben
> 


Re: PostgreSQL JDBC Connections

Posted by t p <ta...@gmail.com>.
Are you using JDBC or the PSQL interpreter? I had encountered something similar while using the PSQL interpreter and I had to restart Zeppelin. 

My experience using PSQL (PostgreSQL, HAWQ) was not as good as using Spark/Scala wrappers (the JDBC data source) to connect via JDBC and then register temp tables. This approach allowed me to work with dynamic forms in a more meaningful way, e.g. using SQL results to create a new drop-down that drives the next page, etc.
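
For instance, the drop-down pattern looks roughly like this (a sketch only: the "example" temp table and its "name" column stand in for whatever you have registered, and z is Zeppelin's built-in ZeppelinContext):

%spark
// build the drop-down options from one query's results
val options = sqlContext.sql("select distinct name from example")
  .collect()
  .map(r => (r.getString(0), r.getString(0)))   // (value, display label)

// render the dynamic form and read the selection
val picked = z.select("name", options.toSeq).toString

// drive the next query/visual from the selection (escaping ignored for brevity)
z.show(sqlContext.sql(s"select * from example where name = '$picked'"))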



> On Jan 5, 2017, at 12:57 PM, Benjamin Kim <bb...@gmail.com> wrote:
> 
> We are getting “out of shared memory” errors when multiple users run SQL queries against our PostgreSQL DB, either simultaneously or over the course of the day. When this happens, Zeppelin 0.6.0 becomes unresponsive to any further SQL queries. It looks like this is caused by too many locks being taken and not released, transactions never closing, and/or connections never closing. Has anyone encountered such an issue with Zeppelin 0.6.0? If so, is there a solution for it?
> 
> Thanks,
> Ben