Posted to dev@lucene.apache.org by "Forest Soup (JIRA)" <ji...@apache.org> on 2016/12/05 02:59:58 UTC

[jira] [Created] (SOLR-9825) Solr should not return HTTP 400 for some cases

Forest Soup created SOLR-9825:
---------------------------------

             Summary: Solr should not return HTTP 400 for some cases
                 Key: SOLR-9825
                 URL: https://issues.apache.org/jira/browse/SOLR-9825
             Project: Solr
          Issue Type: Bug
      Security Level: Public (Default Security Level. Issues are Public)
    Affects Versions: 5.3
            Reporter: Forest Soup


In some cases, when Solr handles a request, it should not always return HTTP 400. We have met several such cases; here are the two most recent:

Case 1: When adding a doc, if a runtime error occurs, Solr returns HTTP 400 even when it is a Solr-internal issue, which confuses the client. In this case the request itself is good, but the IndexWriter is closed.

The exception stack trace is:
2016-11-22 21:23:32.858 ERROR (qtp2011912080-83) [c:collection12 s:shard1 r:core_node1 x:collection12_shard1_replica1] o.a.s.c.SolrCore org.apache.solr.common.SolrException: Exception writing document id Q049dXMxYjMtbWFpbDg4L089bGxuX3VzMQ==20824042!8918AB024CF638F685257DDC00074D78 to the index; possible analysis error.
	at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:167)
	at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:69)
	at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
	at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:955)
	at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1110)
	at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:706)
	at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
	at org.apache.solr.update.processor.LanguageIdentifierUpdateProcessor.processAdd(LanguageIdentifierUpdateProcessor.java:207)
	at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
	at org.apache.solr.update.processor.CloneFieldUpdateProcessorFactory$1.processAdd(CloneFieldUpdateProcessorFactory.java:231)
	at org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.processUpdate(JsonLoader.java:143)
	at org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.load(JsonLoader.java:113)
	at org.apache.solr.handler.loader.JsonLoader.load(JsonLoader.java:76)
	at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:98)
	at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:74)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2068)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:672)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:463)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:235)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:199)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handle(Server.java:499)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
	at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:719)
	at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:733)
	at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1471)
	at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:239)
	at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:163)
	... 40 more
Caused by: java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.FileChannelImpl.size(FileChannelImpl.java:315)
	at org.apache.lucene.store.NativeFSLockFactory$NativeFSLock.ensureValid(NativeFSLockFactory.java:170)
	at org.apache.lucene.store.LockValidatingDirectoryWrapper.createOutput(LockValidatingDirectoryWrapper.java:43)
	at org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:43)
	at org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter.<init>(BlockTreeTermsWriter.java:328)
	at org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter.<init>(BlockTreeTermsWriter.java:280)
	at org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat.fieldsConsumer(Lucene50PostingsFormat.java:428)
	at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsWriter.write(PerFieldPostingsFormat.java:196)
	at org.apache.lucene.index.FreqProxTermsWriter.flush(FreqProxTermsWriter.java:107)
	at org.apache.lucene.index.DefaultIndexingChain.flush(DefaultIndexingChain.java:112)
	at org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:422)
	at org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:503)
	at org.apache.lucene.index.DocumentsWriter.flushAllThreads(DocumentsWriter.java:615)
	at org.apache.lucene.index.IndexWriter.getReader(IndexWriter.java:423)
	at org.apache.lucene.index.StandardDirectoryReader.doOpenIfChanged(StandardDirectoryReader.java:273)
	at org.apache.lucene.index.DirectoryReader.openIfChanged(DirectoryReader.java:203)
	at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1581)
	at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1773)
	at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:609)
	at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	... 1 more


The relevant code (DirectUpdateHandler2.addDoc) is:
  public int addDoc(AddUpdateCommand cmd) throws IOException {
    try {
      return addDoc0(cmd);
    } catch (SolrException e) {
      throw e;
    } catch (RuntimeException t) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          String.format(Locale.ROOT, "Exception writing document id %s to the index; possible analysis error.",
          cmd.getPrintableId()), t);
    }
  }
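
A possible direction (just a sketch, not a proposed patch) would be to inspect the failure before wrapping it, so that server-side problems such as a closed IndexWriter map to a 5xx code while genuine document/analysis problems keep 400. The classification below is an assumption for illustration; ErrorCode.SERVER_ERROR is the existing SolrException code for 500:

  // Sketch only (assumed classification, not the actual Solr fix): treat an
  // AlreadyClosedException as an internal server error instead of a bad request.
  public int addDoc(AddUpdateCommand cmd) throws IOException {
    try {
      return addDoc0(cmd);
    } catch (SolrException e) {
      throw e;
    } catch (RuntimeException t) {
      SolrException.ErrorCode code =
          (t instanceof org.apache.lucene.store.AlreadyClosedException)
              ? SolrException.ErrorCode.SERVER_ERROR   // 500: IndexWriter closed, not the client's fault
              : SolrException.ErrorCode.BAD_REQUEST;   // 400: likely a bad document / analysis error
      throw new SolrException(code,
          String.format(Locale.ROOT, "Exception writing document id %s to the index; possible analysis error.",
          cmd.getPrintableId()), t);
    }
  }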

Case 2: For SolrCloud, when the ZK session has issues or a core node fails to register itself with ZK, it is actually an internal issue or an I/O issue. 503/500/404 are all acceptable, but 400 is not, as it will confuse the client.

The relevant code is:
  public static DocCollection getCollectionLive(ZkStateReader zkStateReader,
      String coll) {
    String collectionPath = getCollectionPath(coll);
    try {
      Stat stat = new Stat();
      byte[] data = zkStateReader.getZkClient().getData(collectionPath, null, stat, true);
      ClusterState state = ClusterState.load(stat.getVersion(), data,
          Collections.<String> emptySet(), collectionPath);
      ClusterState.CollectionRef collectionRef = state.getCollectionStates().get(coll);
      return collectionRef == null ? null : collectionRef.get();
    } catch (KeeperException.NoNodeException e) {
      log.warn("No node available : " + collectionPath, e);
      return null;
    } catch (KeeperException e) {
      throw new SolrException(ErrorCode.BAD_REQUEST,
          "Could not load collection from ZK:" + coll, e);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      throw new SolrException(ErrorCode.BAD_REQUEST,
          "Could not load collection from ZK:" + coll, e);
    }
  }
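
Similarly for Case 2, a sketch (assumed, not the actual fix) that keeps the same logic but reports ZooKeeper trouble as a server-side condition; only the last two catch blocks change, using the existing ErrorCode.SERVER_ERROR (500) or, alternatively, ErrorCode.SERVICE_UNAVAILABLE (503):

    } catch (KeeperException e) {
      // ZooKeeper/cluster problem, not a malformed request: 500 or 503 instead of 400.
      throw new SolrException(ErrorCode.SERVER_ERROR,
          "Could not load collection from ZK:" + coll, e);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      // Being interrupted while talking to ZK is likewise a server-side condition.
      throw new SolrException(ErrorCode.SERVER_ERROR,
          "Could not load collection from ZK:" + coll, e);
    }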


