Posted to dev@hive.apache.org by "Douglas (JIRA)" <ji...@apache.org> on 2013/10/01 12:41:26 UTC
[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.
[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13782799#comment-13782799 ]
Douglas commented on HIVE-5296:
-------------------------------
Ok thanks -- In this case, my system was opening a lot of connections with fewer file handles ([HIVE-4501|https://issues.apache.org/jira/browse/HIVE-4501] is the file handle memory leak). Some of these connections/queries also threw Exceptions -- which probably exacerbated the problem. I'll watch the other issue, but if I see steady heap usage over time as the total number of connections increases, we can mark this resolved.
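One pattern that avoids a missed close when a query throws is try-with-resources, which closes JDBC resources even on the exception path. A minimal self-contained sketch -- the CountingResource class here is a stand-in for a real Connection/Statement (which would need a live HiveServer2), not part of the Hive API:

```java
// Sketch: try-with-resources guarantees close() runs even when the body
// throws, so failed queries cannot leak their resource. CountingResource
// is a hypothetical stand-in for a JDBC resource; java.sql.Connection and
// java.sql.Statement are likewise AutoCloseable.
public class CloseDemo {
    static int openCount = 0;

    static class CountingResource implements AutoCloseable {
        CountingResource() { openCount++; }

        void use(boolean fail) {
            if (fail) throw new RuntimeException("query failed");
        }

        @Override public void close() { openCount--; }
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            try (CountingResource r = new CountingResource()) {
                r.use(i % 2 == 0); // half the "queries" throw
            } catch (RuntimeException e) {
                // swallowed, as the repro's update() does for HiveSQLException
            }
        }
        // Every resource was closed despite the exceptions:
        System.out.println("open after loop: " + openCount); // prints 0
    }
}
```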
> Memory leak: OOM Error after multiple open/closed JDBC connections.
> --------------------------------------------------------------------
>
> Key: HIVE-5296
> URL: https://issues.apache.org/jira/browse/HIVE-5296
> Project: Hive
> Issue Type: Bug
> Components: HiveServer2
> Affects Versions: 0.12.0, 0.13.0
> Environment: Hive 0.12.0, Hadoop 1.1.2, Debian.
> Reporter: Douglas
> Labels: hiveserver
> Fix For: 0.12.0, 0.13.0
>
> Attachments: HIVE-5296.1.patch, HIVE-5296.2.patch, HIVE-5296.patch, HIVE-5296.patch, HIVE-5296.patch
>
> Original Estimate: 168h
> Remaining Estimate: 168h
>
> This error seems to relate to https://issues.apache.org/jira/browse/HIVE-3481
> However, on inspection of the related patch and my built version of Hive (patch carried forward to 0.12.0), I am still seeing the described behaviour.
> With multiple connections to HiveServer2, all of which are closed and disposed of properly, the Java heap is seen to grow extremely quickly.
> This issue can be recreated using the following code
> {code}
> import java.sql.DriverManager;
> import java.sql.Connection;
> import java.sql.ResultSet;
> import java.sql.SQLException;
> import java.sql.Statement;
> import java.util.Properties;
> import org.apache.hive.service.cli.HiveSQLException;
> import org.apache.log4j.Logger;
> /*
> * Class which encapsulates the lifecycle of a query or statement.
> * Provides functionality which allows you to create a connection
> */
> public class HiveClient {
>
>     Connection con;
>     Logger logger;
>     private static String driverName = "org.apache.hive.jdbc.HiveDriver";
>     private String db;
>
>     public HiveClient(String db)
>     {
>         logger = Logger.getLogger(HiveClient.class);
>         this.db = db;
>
>         try{
>             Class.forName(driverName);
>         }catch(ClassNotFoundException e){
>             logger.error("Can't find Hive driver", e);
>         }
>
>         String hiveHost = GlimmerServer.config.getString("hive/host");
>         String hivePort = GlimmerServer.config.getString("hive/port");
>         String connectionString = "jdbc:hive2://"+hiveHost+":"+hivePort+"/default";
>         logger.info(String.format("Attempting to connect to %s", connectionString));
>         try{
>             con = DriverManager.getConnection(connectionString, "", "");
>         }catch(Exception e){
>             logger.error("Problem instantiating the connection: "+e.getMessage());
>         }
>     }
>
>     public int update(String query)
>     {
>         int res = 0;
>         Statement stmt = null;
>         try{
>             stmt = con.createStatement();
>             String switchdb = "USE "+db;
>             logger.info(switchdb);
>             stmt.executeUpdate(switchdb);
>             logger.info(query);
>             res = stmt.executeUpdate(query);
>             logger.info("Query passed to server");
>         }catch(HiveSQLException e){
>             logger.info(String.format("HiveSQLException thrown, this can be valid, " +
>                     "but check the error: %s from the query %s", e.toString(), query));
>         }catch(SQLException e){
>             logger.error(String.format("Unable to execute query %s. SQLException: %s", query, e));
>         }catch(Exception e){
>             logger.error(String.format("Unable to execute query %s. Error: %s", query, e));
>         }finally{
>             // Close in finally so the statement is released even when a query throws.
>             if(stmt != null){
>                 try{
>                     stmt.close();
>                 }catch(SQLException e){
>                     logger.error("Cannot close the statement, potential memory leak: "+e);
>                 }
>             }
>         }
>
>         return res;
>     }
>
>     public void close()
>     {
>         if(con != null){
>             try {
>                 con.close();
>             } catch (SQLException e) {
>                 logger.info("Problem closing connection "+e);
>             }
>         }
>     }
> }
> {code}
> and by creating and closing many HiveClient objects. The heap space used by the hiveserver2 RunJar process is seen to increase extremely quickly, without that space being released.
--
This message was sent by Atlassian JIRA
(v6.1#6144)