Posted to mapreduce-dev@hadoop.apache.org by "Tomoya Tainaka (JIRA)" <ji...@apache.org> on 2011/04/26 04:00:03 UTC

[jira] [Created] (MAPREDUCE-2453) DBInputFormat throws OutOfMemory exceptions when importing large tables from PostgreSQL.

DBInputFormat throws OutOfMemory exceptions when importing large tables from PostgreSQL.
----------------------------------------------------------------------------------------

                 Key: MAPREDUCE-2453
                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-2453
             Project: Hadoop Map/Reduce
          Issue Type: Bug
    Affects Versions: 0.21.0
            Reporter: Tomoya Tainaka
            Priority: Minor


DBInputFormat throws OutOfMemory exceptions ("Java heap space" errors) when importing large tables from PostgreSQL. This happens because the entire ResultSet is loaded into memory at once, which exhausts the Java heap. The same problem occurs with DataDrivenDBInputFormat.

I suggest utilizing the JDBC "fetch size" parameter within DBInputFormat.
I made patches as follows:
- modifying org.apache.hadoop.mapreduce.lib.db.DBInputFormat
- creating org.apache.hadoop.mapreduce.PostgreSQLDBRecordReader.java
- adding the JDBC fetch-size options, namely setFetchSize() and setAutoCommit()

I also modified DataDrivenDBInputFormat in the same way as DBInputFormat.
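For reference, the fetch-size approach described above can be sketched in plain JDBC as follows. This is a minimal illustration, not the actual patch; the connection URL, table name, and batch size of 1000 are placeholder values. Note that the PostgreSQL JDBC driver only honors setFetchSize() (via a server-side cursor) when auto-commit is disabled, which is why setAutoCommit(false) is needed alongside it.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CursorFetchSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own database.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/testdb", "user", "password");

        // The PostgreSQL driver buffers the whole ResultSet in memory
        // unless auto-commit is off; disabling it enables a server-side
        // cursor so rows can be streamed incrementally.
        conn.setAutoCommit(false);

        Statement stmt = conn.createStatement();
        // Fetch rows from the cursor in batches of 1000 instead of
        // materializing the entire table at once.
        stmt.setFetchSize(1000);

        ResultSet rs = stmt.executeQuery("SELECT * FROM large_table");
        while (rs.next()) {
            // process one row at a time without exhausting the heap
        }

        rs.close();
        stmt.close();
        conn.commit();
        conn.close();
    }
}
```

With this configuration the driver holds at most one batch of rows in memory at a time, which is what keeps DBInputFormat's record reader from exhausting the heap on large tables.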


--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira