Posted to issues@solr.apache.org by "Jan Høydahl (Jira)" <ji...@apache.org> on 2021/06/11 14:19:00 UTC

[jira] [Resolved] (SOLR-15252) Solr should log WARN log when a query requests huge rows number

     [ https://issues.apache.org/jira/browse/SOLR-15252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jan Høydahl resolved SOLR-15252.
--------------------------------
    Resolution: Won't Do

Closing this as Won't Do. I hope others will pick up one of the more complex proposals discussed here, since today users get no warning at all about this kind of bad usage.

> Solr should log WARN log when a query requests huge rows number
> ---------------------------------------------------------------
>
>                 Key: SOLR-15252
>                 URL: https://issues.apache.org/jira/browse/SOLR-15252
>             Project: Solr
>          Issue Type: Improvement
>          Components: query
>            Reporter: Jan Høydahl
>            Assignee: Jan Høydahl
>            Priority: Major
>          Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> We have all seen it - clients that pass Integer.MAX_VALUE or 10000000 as the rows parameter just to make sure they get all possible results. This of course leads to long GC pauses, since Lucene allocates an array up front to hold the results.
> Solr should either log a WARN when it encounters a value above a certain threshold, such as 100k (beyond which cursorMark should be used instead), or it should simply respond with a 400 error and provide a system property or query parameter that folks can use to override the limit if they know what they are doing.
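For illustration only, not the approach from any actual patch: a rough sketch of what such a check could look like as a custom SearchComponent. The class name RowsLimitComponent, the 100k threshold, and the allowHugeRows override parameter are placeholders invented for this sketch; a real implementation would make them configurable.

    // Illustrative sketch only -- class name, threshold and override flag are placeholders.
    import java.io.IOException;
    import java.lang.invoke.MethodHandles;

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.params.CommonParams;
    import org.apache.solr.common.params.SolrParams;
    import org.apache.solr.handler.component.ResponseBuilder;
    import org.apache.solr.handler.component.SearchComponent;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class RowsLimitComponent extends SearchComponent {
      private static final Logger log =
          LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

      // Arbitrary threshold for the example; the issue suggests something like 100k.
      private static final int WARN_THRESHOLD = 100_000;

      @Override
      public void prepare(ResponseBuilder rb) throws IOException {
        SolrParams params = rb.req.getParams();
        int rows = params.getInt(CommonParams.ROWS, 10);
        if (rows > WARN_THRESHOLD) {
          // Option 1 from the issue: warn but let the request through.
          log.warn("Query requested rows={}, which exceeds {}; consider cursorMark or paging instead",
              rows, WARN_THRESHOLD);
          // Option 2 from the issue: reject unless the client explicitly opts in.
          // "allowHugeRows" is a made-up parameter name for this sketch.
          if (!params.getBool("allowHugeRows", false)) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "rows=" + rows + " exceeds the configured limit of " + WARN_THRESHOLD);
          }
        }
      }

      @Override
      public void process(ResponseBuilder rb) throws IOException {
        // No-op: the check happens in prepare().
      }

      @Override
      public String getDescription() {
        return "Warns or rejects queries that request an excessive rows value";
      }
    }

Such a component would be registered in solrconfig.xml and added to the search handler's component list; the WARN-only and 400-error branches above correspond to the two alternatives described in the issue.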



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@solr.apache.org
For additional commands, e-mail: issues-help@solr.apache.org