Posted to derby-dev@db.apache.org by "Sunitha Kambhampati (JIRA)" <de...@db.apache.org> on 2006/01/09 18:41:37 UTC
[jira] Commented: (DERBY-802) OutOfMemoryError when reading large blob when statement type is ResultSet.TYPE_SCROLL_INSENSITIVE
[ http://issues.apache.org/jira/browse/DERBY-802?page=comments#action_12362225 ]
Sunitha Kambhampati commented on DERBY-802:
-------------------------------------------
From the list, Dan suggested the following workaround: change the statement type to the default ResultSet.TYPE_FORWARD_ONLY, and the repro works OK.
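A minimal sketch of that workaround (the class and helper names here are mine, not from the report; only the two ResultSet constants and the Connection.createStatement call are from the JDBC API):

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class ForwardOnlyWorkaround {

    // Hypothetical helper: create a forward-only, read-only statement.
    // With TYPE_FORWARD_ONLY Derby can stream the blob, instead of
    // materializing it to support scrolling as TYPE_SCROLL_INSENSITIVE does.
    static Statement createForwardOnlyStatement(Connection conn) throws Exception {
        return conn.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                    ResultSet.CONCUR_READ_ONLY);
    }

    public static void main(String[] args) {
        // The only change versus the failing repro is which of these two
        // result-set type constants is passed to createStatement.
        System.out.println("TYPE_SCROLL_INSENSITIVE = " + ResultSet.TYPE_SCROLL_INSENSITIVE);
        System.out.println("TYPE_FORWARD_ONLY       = " + ResultSet.TYPE_FORWARD_ONLY);
    }
}
```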
===================
I modified the repro to read a 100 MB blob, ran it under a heap profiler, and found the following trace.
percent live alloc'ed stack class
rank self accum bytes objs bytes objs trace name
1 59.59% 59.59% 30653040 935 31682912 1046 2050 [B
2 32.61% 92.20% 16777232 1 33489040 9 7268 [B
3 3.12% 95.33% 1606416 49 1737552 53 6935 [B
4 0.35% 95.68% 181632 946 194688 1014 2014 org.apache.derby.impl.store.raw.data.StoredPage
......
I haven't looked very closely here, but from this trace and the code it seems to me that some materialization of the blob is happening.
TRACE 7268:
org.apache.derby.iapi.types.SQLBinary.readFromStream(SQLBinary.java:384)
org.apache.derby.iapi.types.SQLBinary.readExternal(SQLBinary.java:301)
org.apache.derby.iapi.types.SQLBinary.getValue(SQLBinary.java:194)
org.apache.derby.iapi.types.SQLBinary.getClone(SQLBinary.java:498)
> OutOfMemoryError when reading large blob when statement type is ResultSet.TYPE_SCROLL_INSENSITIVE
> ---------------------------------------------------------------------------------------------------
>
> Key: DERBY-802
> URL: http://issues.apache.org/jira/browse/DERBY-802
> Project: Derby
> Type: Bug
> Components: JDBC
> Versions: 10.0.2.0, 10.0.2.1, 10.1.1.0, 10.1.1.1, 10.1.1.2, 10.1.2.0, 10.1.2.1, 10.2.0.0, 10.1.3.0, 10.1.2.2, 10.0.2.2
> Environment: all
> Reporter: Sunitha Kambhampati
>
> Grégoire Dubois reported this problem on the list. From his mail: the reproduction is attached below.
> When the statement type is set to ResultSet.TYPE_SCROLL_INSENSITIVE, an OutOfMemoryError is thrown when reading large blobs.
> import java.sql.*;
> import java.io.*;
> /**
> *
> * @author greg
> */
> public class derby_filewrite_fileread {
>
> private static File file = new File("/mnt/BigDisk/Clips/BabyMamaDrama-JShin.wmv");
> private static File destinationFile = new File("/home/greg/DerbyDatabase/"+file.getName());
>
> /** Creates a new instance of derby_filewrite_fileread */
> public derby_filewrite_fileread() {
>
> }
>
> public static void main(String args[]) {
> try {
> Class.forName("org.apache.derby.jdbc.EmbeddedDriver").newInstance();
> Connection connection = DriverManager.getConnection ("jdbc:derby:/home/greg/DerbyDatabase/BigFileTestDB;create=true", "APP", "");
> connection.setAutoCommit(false);
>
> Statement statement = connection.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
> ResultSet result = statement.executeQuery("SELECT TABLENAME FROM SYS.SYSTABLES");
>
> // Create table if it doesn't already exist.
> boolean exist=false;
> while ( result.next() ) {
> if ("db_file".equalsIgnoreCase(result.getString(1)))
> exist=true;
> }
> if ( !exist ) {
> System.out.println("Create table db_file.");
> statement.execute("CREATE TABLE db_file ("+
> " name VARCHAR(40),"+
> " file BLOB(2G) NOT NULL)");
> connection.commit();
> }
>
> // Read file from disk, write on DB.
> System.out.println("1 - Read file from disk, write on DB.");
> PreparedStatement preparedStatement=connection.prepareStatement("INSERT INTO db_file(name,file) VALUES (?,?)");
> FileInputStream fileInputStream = new FileInputStream(file);
> preparedStatement.setString(1, file.getName());
> preparedStatement.setBinaryStream(2, fileInputStream, (int)file.length());
> preparedStatement.execute();
> connection.commit();
> fileInputStream.close();
> System.out.println("2 - END OF Read file from disk, write on DB.");
>
>
> // Read file from DB, and write on disk.
> System.out.println("3 - Read file from DB, and write on disk.");
> result = statement.executeQuery("SELECT file FROM db_file WHERE name='"+file.getName()+"'");
> byte[] buffer = new byte [1024];
> result.next();
> BufferedInputStream inputStream=new BufferedInputStream(result.getBinaryStream(1),1024);
> FileOutputStream outputStream = new FileOutputStream(destinationFile);
> int readBytes = 0;
> while (readBytes!=-1) {
> readBytes=inputStream.read(buffer,0,buffer.length);
> if ( readBytes != -1 )
> outputStream.write(buffer, 0, readBytes);
> }
> inputStream.close();
> outputStream.close();
> System.out.println("4 - END OF Read file from DB, and write on disk.");
> }
> catch (Exception e) {
> e.printStackTrace(System.err);
> }
> }
> }
> It returns
> 1 - Read file from disk, write on DB.
> 2 - END OF Read file from disk, write on DB.
> 3 - Read file from DB, and write on disk.
> java.lang.OutOfMemoryError
> if the file is ~10 MB or larger.
--
This message is automatically generated by JIRA.