Posted to issues@spark.apache.org by "pin_zhang (Jira)" <ji...@apache.org> on 2020/03/30 12:43:00 UTC
[jira] [Commented] (SPARK-25804) JDOPersistenceManager leak when query via JDBC
[ https://issues.apache.org/jira/browse/SPARK-25804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17070938#comment-17070938 ]
pin_zhang commented on SPARK-25804:
-----------------------------------
Any comments on this issue?
> JDOPersistenceManager leak when query via JDBC
> ----------------------------------------------
>
> Key: SPARK-25804
> URL: https://issues.apache.org/jira/browse/SPARK-25804
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: pin_zhang
> Priority: Major
> Attachments: image-2018-10-27-01-44-07-972.png
>
>
> 1. Start the Thrift server with start-thriftserver.sh under Spark 2.3.1
> 2. Create a table and insert a row
> create table test_leak (id string, index int);
> insert into test_leak values ('id1', 1);
> 3. Create a JDBC client that queries the table in a loop
> import java.sql.*;
>
> public class HiveClient {
>     public static void main(String[] args) throws Exception {
>         // Register the Hive JDBC driver and connect to HiveServer2.
>         Class.forName("org.apache.hive.jdbc.HiveDriver");
>         Connection con = DriverManager.getConnection(
>                 "jdbc:hive2://localhost:10000/default", "test", "test");
>         Statement stmt = con.createStatement();
>         String sql = "select * from test_leak";
>         int loop = 1000000;
>         while (loop-- > 0) {
>             ResultSet rs = stmt.executeQuery(sql);
>             rs.next();
>             System.out.println(new java.sql.Timestamp(System.currentTimeMillis()) + " : " + rs.getString(1));
>             rs.close();
>             // Pause every 100 queries so heap growth can be observed between batches.
>             if (loop % 100 == 0) {
>                 Thread.sleep(10000);
>             }
>         }
>         stmt.close();
>         con.close();
>     }
> }
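> To compile and run the client, the Hive JDBC driver must be on the classpath. A minimal sketch (the standalone jar name is an assumption; use the jar shipped with your Hive/Spark distribution, plus any Hadoop client jars your driver version requires):
>
> javac HiveClient.java
> java -cp .:hive-jdbc-2.3.1-standalone.jar HiveClient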
> 4. Dump the HiveServer2 heap: org.datanucleus.api.jdo.JDOPersistenceManager instances keep increasing.
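> One way to verify is a class histogram of the HiveServer2 process (a sketch, assuming the JDK's jmap tool is available; <HS2_PID> is a placeholder for the HiveServer2 process id):
>
> jmap -histo <HS2_PID> | grep org.datanucleus.api.jdo.JDOPersistenceManager
>
> Repeating this between query batches shows the instance count growing instead of holding steady.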