Posted to solr-user@lucene.apache.org by "Murugan, Muniraja (CTR) Offshore" <Mu...@Cigna.com> on 2014/02/13 17:20:37 UTC

ORA-04030: out of process memory when trying to allocate 4032 bytes - Please advise

Dear All,

I am getting an "ORA-04030: out of process memory when trying to allocate 4032 bytes" error when I try to index XMLType data from an Oracle DB. I have also filed this in the Solr issue tracker as "SOLR-5723".

I have XMLType data in my Oracle DB. I am converting the XMLType to a CLOB and then to an XML string using ClobTransformer. When I process more than 100,000 (1 lakh) records, I get the following error:
ERROR - 2014-02-06 16:42:04.957; org.apache.solr.common.SolrException; getNext() failed for query 'select XMLSERIALIZE(CONTENT object_value AS CLOB NO INDENT) POLICY_DOC,id,rowid from ifp_policy':org.apache.solr.handler.dataimport.DataImportHandlerException: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 4032 bytes (qmxtgCreateBuf,kghsseg: kolaslCreateCtx)
at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:63)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:368)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$600(JdbcDataSource.java:254)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:289)
at org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:116)
at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:75)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:243)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:469)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:408)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:323)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:231)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:411)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:476)
at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:457)
Caused by: java.sql.SQLException: ORA-04030: out of process memory when trying to allocate 4032 bytes (qmxtgCreateBuf,kghsseg: kolaslCreateCtx)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:331)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:288)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:743)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:207)
at oracle.jdbc.driver.T4CStatement.fetch(T4CStatement.java:1018)
at oracle.jdbc.driver.OracleResultSetImpl.close_or_fetch_from_next(OracleResultSetImpl.java:291)
at oracle.jdbc.driver.OracleResultSetImpl.next(OracleResultSetImpl.java:213)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:360)
... 12 more
The content of my db-database-config.xml file is as follows:
<entity name="IFPPOLICY" processor="SqlEntityProcessor" dataSource="ds1" transformer="ClobTransformer"
        query="select XMLSERIALIZE(CONTENT object_value AS CLOB NO INDENT) POLICY_DOC,primaryCustomerId,id,rowid,relationshipCode,givenName,lastName
               from ifp_policy,
                    XMLTABLE(xmlnamespaces('http://www.cigna.com/ifp/domains/policy/2013/05' AS " pol ",
                                           'http://www.cigna.com/ifp/domains/common/2012/06' AS " cm ",
                                           'http://www.cigna.com/ifp/domains/common/eligibility/2013/05' AS " cel ",
                                           'http://www.cigna.com/ifp/domains/utility/2012/06' AS util),
                             '/pol:insurancePolicy' PASSING ifp_policy.OBJECT_VALUE
                             columns primaryCustomerId VARCHAR2(100) path '@primaryCustomerID',
                                     relationshipCode VARCHAR2(50) path 'cel:customers/cel:customer[cel:customerInformation/cel:relationshipCode=" Primary "]/cel:name/cm:givenName',
                                     givenName VARCHAR2(50) path 'cel:customers/cel:customer[1]/cel:name/cm:givenName',
                                     lastName VARCHAR2(50) path 'cel:customers/cel:customer[1]/cel:name/cm:surName')"
        pk="id">
  <entity name="IFPPOLICY" processor="SqlEntityProcessor" dataSource="ds1" transformer="ClobTransformer"
          query="select object_value.toStringVal() as POLICY_DOC,id,rowid from ifp_policy"
          pk="id">
    <field column="rowid" name="rowid"/>
    <field column="id" name="id"/>
    <field column="POLICY_DOC" name="POLICY_DOC" clob="true"/>
    <field column="primaryCustomerId" name="primaryCustomerId"/>
    <field column="givenName" name="givenName"/>
    <field column="lastName" name="lastName"/>
  </entity>
  <field column="policy" clob="true"/>
</entity>
Any help is appreciated!
Thanks

Best Regards,
Muniaraja M
(M) : +91 9840329175
Email: muniraja.murugan@cigna.com



Re: ORA-04030: out of process memory when trying to allocate 4032 bytes - Please advise

Posted by Shawn Heisey <so...@elyograg.org>.
On 2/13/2014 9:20 AM, Murugan, Muniraja (CTR) Offshore wrote:
> I am getting an "ORA-04030: out of process memory when trying to allocate 4032 bytes" error when I try to index XMLType data from an Oracle DB. I have also filed this in the Solr issue tracker as "SOLR-5723".

The issue tracker is for bugs and feature requests.  This mailing list 
and other discussion forums are for support.  This is probably not a bug 
in Solr, so we'll handle it here.  I will close the issue, but if it 
turns out that it actually is a bug, we can re-open it.

By default, most JDBC drivers try to buffer the entire result set in RAM.
Unless your Java heap is big enough to hold everything Solr needs as well
as an entire SQL result set, the import is going to fail.

You need the batchSize parameter on your dataSource in the dataimport
config.  Try a value of -1 first; if that doesn't work, try something
like 500.  I know for sure that a value of -1 works with a MySQL
database, but I have never used Oracle, so I don't know whether it will
work there.

You might need to get in touch with Oracle to get a different version of 
their JDBC driver. A bug in the driver is much more likely than one in DIH.

http://wiki.apache.org/solr/DataImportHandler#Configuring_JdbcDataSource

Thanks,
Shawn