Posted to issues@solr.apache.org by "David Smiley (Jira)" <ji...@apache.org> on 2021/03/12 18:06:00 UTC

[jira] [Commented] (SOLR-15252) Solr should log WARN log when a query requests huge rows number

    [ https://issues.apache.org/jira/browse/SOLR-15252?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17300514#comment-17300514 ] 

David Smiley commented on SOLR-15252:
-------------------------------------

This is a search anti-pattern indeed.  I propose this be handled with a new threshold like "maxRows" in solr.xml/solrconfig.xml, similar to maxBooleanClauses.  The default should probably be 10k, based on what I recall was observed in benchmarks for cursor paging.
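A minimal sketch of how that could look in solrconfig.xml, next to the existing maxBooleanClauses element (the maxRows element name, its placement, and the 10000 default are only this proposal, not a setting that exists today):

    <!-- existing global cap on the number of boolean clauses per query -->
    <maxBooleanClauses>1024</maxBooleanClauses>

    <!-- hypothetical: proposed cap on the rows parameter; queries asking for
         more than this would be warned about or rejected -->
    <maxRows>10000</maxRows>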

> Solr should log WARN log when a query requests huge rows number
> ---------------------------------------------------------------
>
>                 Key: SOLR-15252
>                 URL: https://issues.apache.org/jira/browse/SOLR-15252
>             Project: Solr
>          Issue Type: Improvement
>      Security Level: Public (Default Security Level. Issues are Public)
>          Components: query
>            Reporter: Jan Høydahl
>            Assignee: Jan Høydahl
>            Priority: Major
>
> We have all seen it: clients that use Integer.MAX_VALUE or 10000000 as the rows parameter just to make sure they get all possible results. This of course leads to high GC pauses, since Lucene allocates an array up front to hold the results.
> Solr should either log a WARN when it encounters a value above a certain threshold, such as 100k (beyond that, cursorMark should be used instead), or simply respond with a 400 error and offer a system property or query parameter folks can use to override the limit if they know what they are doing.
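For reference, the cursorMark alternative mentioned in the description pages through large result sets with a normal rows value; a minimal request sequence could look like this (the collection and field names are placeholders, and the sort must include the uniqueKey field):

    /solr/mycollection/select?q=*:*&rows=1000&sort=id+asc&cursorMark=*

Each response carries a nextCursorMark value; pass it back as cursorMark on the next request and repeat until the returned cursor stops changing.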



--
This message was sent by Atlassian Jira
(v8.3.4#803005)