Posted to solr-user@lucene.apache.org by Stephanie Belton <so...@zizou.net> on 2007/01/12 15:52:17 UTC

Lock obtain timeout

Hello,

We are getting hundreds of ‘Lock obtain timed out’ errors (see the stack trace below) – we have been checking the mailing list archive for clues:

INFO: Opening Searcher@92e900 DirectUpdateHandler2
12-Jan-2007 14:36:37 org.apache.solr.core.SolrException log
SEVERE: Exception during commit/optimize:java.io.IOException: Lock obtain timed out: SimpleFSLock@/tmp/lucene-5d12dd782520964674beb001c4877b36-write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:69)
        at org.apache.lucene.index.IndexReader.aquireWriteLock(IndexReader.java:516)
        at org.apache.lucene.index.IndexReader.deleteDocument(IndexReader.java:541)
        at org.apache.solr.update.DirectUpdateHandler2.doDeletions(DirectUpdateHandler2.java:459)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:494)
        at org.apache.solr.core.SolrCore.update(SolrCore.java:763)
        at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:53)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:616)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:428)
        at org.mortbay.jetty.servlet.WebApplicationHandler.dispatch(WebApplicationHandler.java:473)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:568)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1530)
        at org.mortbay.jetty.servlet.WebApplicationContext.handle(WebApplicationContext.java:633)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1482)
        at org.mortbay.http.HttpServer.service(HttpServer.java:909)
        at org.mortbay.http.HttpConnection.service(HttpConnection.java:820)
        at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:986)
        at org.mortbay.http.HttpConnection.handle(HttpConnection.java:837)
        at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:245)
        at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:357)
        at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:534)

and also many ‘Too many open files’ errors, such as:

12-Jan-2007 14:49:00 org.apache.solr.core.SolrException log
SEVERE: java.io.IOException: Too many open files
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.createNewFile(File.java:850)
        at org.apache.lucene.store.SimpleFSLock.obtain(SimpleFSLockFactory.java:122)
        at org.apache.lucene.store.Lock.obtain(Lock.java:60)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:258)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:208)
        at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:66)
        at org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:119)
        at org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:176)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:247)
        at org.apache.solr.core.SolrCore.update(SolrCore.java:716)
        at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:53)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:616)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:428)
        at org.mortbay.jetty.servlet.WebApplicationHandler.dispatch(WebApplicationHandler.java:473)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:568)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1530)
        at org.mortbay.jetty.servlet.WebApplicationContext.handle(WebApplicationContext.java:633)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1482)
        at org.mortbay.http.HttpServer.service(HttpServer.java:909)
        at org.mortbay.http.HttpConnection.service(HttpConnection.java:820)
        at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:986)
        at org.mortbay.http.HttpConnection.handle(HttpConnection.java:837)
        at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:245)
        at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:357)
        at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:534)

12-Jan-2007 14:49:00 org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start commit(optimize=false,waitFlush=false,waitSearcher=true)
12-Jan-2007 14:49:00 org.apache.solr.update.DirectUpdateHandler2 doDeletions
INFO: DirectUpdateHandler2 deleting and removing dups for 5 ids
12-Jan-2007 14:49:00 org.apache.solr.core.SolrException log
SEVERE: Exception during commit/optimize:java.io.FileNotFoundException: /usr/local/solr/slando/solr/data/index/_7ut.tis (Too many open files)
        at java.io.RandomAccessFile.open(Native Method)
        at java.io.RandomAccessFile.<init>(RandomAccessFile.java:212)
        at org.apache.lucene.store.FSIndexInput$Descriptor.<init>(FSDirectory.java:482)
        at org.apache.lucene.store.FSIndexInput.<init>(FSDirectory.java:491)
        at org.apache.lucene.store.FSDirectory.openInput(FSDirectory.java:424)
        at org.apache.lucene.index.TermInfosReader.<init>(TermInfosReader.java:49)
        at org.apache.lucene.index.SegmentReader.initialize(SegmentReader.java:148)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:130)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:111)
        at org.apache.lucene.index.IndexReader$1.doBody(IndexReader.java:153)
        at org.apache.lucene.store.Lock$With.run(Lock.java:116)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:142)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:126)
        at org.apache.solr.search.SolrIndexSearcher.<init>(SolrIndexSearcher.java:85)
        at org.apache.solr.core.SolrCore.newSearcher(SolrCore.java:117)
        at org.apache.solr.update.DirectUpdateHandler2.openSearcher(DirectUpdateHandler2.java:194)
        at org.apache.solr.update.DirectUpdateHandler2.doDeletions(DirectUpdateHandler2.java:421)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:494)
        at org.apache.solr.core.SolrCore.update(SolrCore.java:763)
        at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:53)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:616)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:428)
        at org.mortbay.jetty.servlet.WebApplicationHandler.dispatch(WebApplicationHandler.java:473)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:568)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1530)
        at org.mortbay.jetty.servlet.WebApplicationContext.handle(WebApplicationContext.java:633)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1482)
        at org.mortbay.http.HttpServer.service(HttpServer.java:909)
        at org.mortbay.http.HttpConnection.service(HttpConnection.java:820)
        at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:986)
        at org.mortbay.http.HttpConnection.handle(HttpConnection.java:837)
        at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:245)
        at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:357)
        at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:534)

Are the two problems related? Looking through the mailing list, it seems that changing the useCompoundFile setting from false to true could help, but before I do that I would like to understand whether there are undesirable side effects. Why isn’t this param set to true by default?

Thanks

Stephanie


RE: Lock obtain timeout

Posted by Stephanie Belton <so...@zizou.net>.
Thanks for that - I have made the following changes:

- optimize more often
- omitNorms on all non-fulltext fields
- useCompoundFile=true (will keep an eye on performance)

And that seems to have solved the problem.
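
For anyone hitting the same wall, here is a minimal sketch of where those knobs live, assuming a stock Solr 1.x solrconfig.xml/schema.xml; the field names are hypothetical, and "optimize more often" just means sending <optimize/> to the update handler more frequently:

    <!-- solrconfig.xml: write segments in Lucene's compound file format,
         folding each segment's many files into a single .cfs file -->
    <indexDefaults>
      <useCompoundFile>true</useCompoundFile>
    </indexDefaults>
    <mainIndex>
      <useCompoundFile>true</useCompoundFile>
    </mainIndex>

    <!-- schema.xml: omitNorms on fields that never need length/boost
         scoring; every field carrying norms adds per-segment index data -->
    <field name="price" type="sfloat" indexed="true" stored="true" omitNorms="true"/>
    <field name="city"  type="string" indexed="true" stored="true" omitNorms="true"/>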

-----Original Message-----
From: Chris Hostetter [mailto:hossman_lucene@fucit.org] 
Sent: 13 January 2007 01:37
To: solr-user@lucene.apache.org
Subject: Re: Lock obtain timeout



: Are the two problems related? Looking through the mailing list it seems
: that changing the settings for useCompoundFile from false to true could
: help but before I do that I would like to understand if there are
: undesirable side effects, why isn’t this param set to true by
: default?

Too Many Open Files can happen for lots of different reasons:
one is that you have so many indexed fields with norms that the number of
files in your index is too big -- that's the use case where
useCompoundFile=true can help you -- but it's not set that way by default
because it can make searching slower.

the other reason why you can have too many open files is if you are
getting more concurrent requests than you can handle -- or if the clients
initiating those requests aren't closing them properly (sockets count as
files too)

understanding why you are getting these errors requires that you look at
what your hard and soft file limits are (ulimit -aH and ulimit -aS on my
system) and what files are in use by Solr when these errors occur (lsof -p
_solrpid_).
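
A concrete version of those checks, assuming a typical Linux box and a Solr started with Jetty's start.jar (the pgrep pattern is illustrative, just one way to find the pid):

    # hard and soft per-process limits -- look for the "open files" row
    ulimit -aH
    ulimit -aS

    # count and categorize Solr's open descriptors while the errors occur
    SOLR_PID=$(pgrep -f start.jar)
    lsof -p "$SOLR_PID" | wc -l
    lsof -p "$SOLR_PID" | awk '{print $NF}' | sort | uniq -c | sort -rn | head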

to answer your earlier question, i *think* you may be getting the lock
timeout errors because Solr can't create the lock file once it has run
out of file handles ... i'm not 100% sure.




-Hoss




Re: Lock obtain timeout

Posted by Chris Hostetter <ho...@fucit.org>.

: Are the two problems related? Looking through the mailing list it seems
: that changing the settings for useCompoundFile from false to true could
: help but before I do that I would like to understand if there are
: undesirable side effects, why isn’t this param set to true by
: default?

Too Many Open Files can happen for lots of different reasons:
one is that you have so many indexed fields with norms that the number of
files in your index is too big -- that's the use case where
useCompoundFile=true can help you -- but it's not set that way by default
because it can make searching slower.

the other reason why you can have too many open files is if you are
getting more concurrent requests than you can handle -- or if the clients
initiating those requests aren't closing them properly (sockets count as
files too)

understanding why you are getting these errors requires that you look at
what your hard and soft file limits are (ulimit -aH and ulimit -aS on my
system) and what files are in use by Solr when these errors occur (lsof -p
_solrpid_).

to answer your earlier question, i *think* you may be getting the lock
timeout errors because Solr can't create the lock file once it has run
out of file handles ... i'm not 100% sure.
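
If that theory holds, the fix is to free up descriptors rather than fight the lock itself. A hedged sketch, assuming Linux and the /tmp lock path from the trace above; the limit value is illustrative, and the lock file should only ever be removed with Solr fully stopped:

    # raise the soft open-files limit in this shell before launching Solr
    ulimit -n 8192
    java -jar start.jar

    # with Solr stopped: clear a stale Lucene write lock; the hash in the
    # filename is derived from the index directory, so yours will differ
    rm /tmp/lucene-*-write.lock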




-Hoss