Posted to solr-user@lucene.apache.org by Mustafa KIZILDAĞ <mu...@gmail.com> on 2015/05/29 19:34:54 UTC

user interface

Hi,

My name is Mustafa. I'm a master's student at YTU in Turkey. I am building a
crawler for a VoIP problem for my job and school. I want to customize Solr's
user interface - for example, to add an image or a comment to the user
interface.

I searched for this but couldn't find a good answer.

Could you help me?

Best Regards.

Mustafa KIZILDAĞ

Re: user interface

Posted by Erik Hatcher <er...@gmail.com>.
Which user interface?  Do you mean the admin UI?   Or perhaps /browse?  


—
Erik Hatcher, Senior Solutions Architect
http://www.lucidworks.com




> On May 29, 2015, at 1:34 PM, Mustafa KIZILDAĞ <mu...@gmail.com> wrote:
> 
> Hi,
> 
> My name is Mustafa. I'm a master's student at YTU in Turkey. I am building a
> crawler for a VoIP problem for my job and school. I want to customize Solr's
> user interface - for example, to add an image or a comment to the user
> interface.
> 
> I searched for this but couldn't find a good answer.
> 
> Could you help me?
> 
> Best Regards.
> 
> Mustafa KIZILDAĞ


Re: Deleting Fields

Posted by Charlie Hull <ch...@flax.co.uk>.
On 30/05/2015 00:30, Shawn Heisey wrote:
> On 5/29/2015 5:08 PM, Joseph Obernberger wrote:
>> Hi All - I have a lot of fields to delete, but noticed that once I
>> started deleting them, I quickly ran out of heap space.  Is
>> delete-field a memory intensive operation?  Should I delete one field,
>> wait a while, then delete the next?
>
> I'm not aware of a way to delete a field.  I may have a different
> definition of what a field is than you do, though.
>
> Solr lets you delete entire documents, but deleting a field from the
> entire index would involve re-indexing every document in the index,
> excluding that field.
>
> Can you be more specific about exactly what you are doing, what you are
> seeing, and what you want to see instead?
>
> Also, please be aware of this:
>
> http://people.apache.org/~hossman/#threadhijack
>
> Thanks,
> Shawn
>
Here's a rather old post on how we did something similar:
http://www.flax.co.uk/blog/2011/06/24/how-to-remove-a-stored-field-in-lucene/

Cheers

Charlie

-- 
Charlie Hull
Flax - Open Source Enterprise Search

tel/fax: +44 (0)8700 118334
mobile:  +44 (0)7767 825828
web: www.flax.co.uk

Re: Deleting Fields

Posted by Steve Rowe <sa...@gmail.com>.
Hi Joseph,

> On May 30, 2015, at 8:18 AM, Joseph Obernberger <jo...@lovehorsepower.com> wrote:
> 
> Thank you Erick.  I was thinking that it actually went through and removed the index data; thank you for the clarification.

I added more info to the Schema API page making clear that this is not the case.  Here's what I've got so far - let me know if you think we should add more warnings about it:

-----
Re-index after schema modifications!

If you modify your schema, you will likely need to re-index all documents. If you do not, you may lose access to documents, or not be able to interpret them properly, e.g. after replacing a field type.

Modifying your schema will never modify any documents that are already indexed. Again, you must re-index documents in order to apply schema changes to them.

[…]

When modifying the schema with the API, a core reload will automatically occur in order for the changes to be available immediately for documents indexed thereafter.  Previously indexed documents will not be automatically handled - they must be re-indexed if they used schema elements that you changed.
-----
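For example, a field type change through the API would be something along these lines (the collection, field, and type names here are placeholders), after which any documents that used the changed field must be re-indexed:

  curl -X POST -H 'Content-type:application/json' --data-binary '{
    "replace-field" : { "name" : "title_t", "type" : "text_general", "stored" : true }
  }' http://localhost:8983/solr/mycollection/schema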

Steve

Re: Deleting Fields

Posted by Joseph Obernberger <jo...@lovehorsepower.com>.
Hi - we are using a 64-bit OS and a 64-bit JVM.  The JVM settings are currently:
-----------------------------------------
-DSTOP.KEY=solrrocks
-DSTOP.PORT=8100
-Dhost=helios
-Djava.net.preferIPv4Stack=true
-Djetty.port=9100
-DnumShards=27
-Dsolr.clustering.enabled=true
-Dsolr.install.dir=/opt/solr
-Dsolr.lock.type=hdfs
-Dsolr.solr.home=/opt/solr/server/solr
-Duser.timezone=UTC
-DzkClientTimeout=15000
-DzkHost=eris.querymasters.com:2181,daphnis.querymasters.com:2181,triton.querymasters.com:2181,oberon.querymasters.com:2181,portia.querymasters.com:2181,puck.querymasters.com:2181/solr5
-XX:+CMSParallelRemarkEnabled
-XX:+CMSScavengeBeforeRemark
-XX:+ParallelRefProcEnabled
-XX:+PrintGCApplicationStoppedTime
-XX:+PrintGCDateStamps
-XX:+PrintGCDetails
-XX:+PrintGCTimeStamps
-XX:+PrintHeapAtGC
-XX:+PrintTenuringDistribution
-XX:+UseCMSInitiatingOccupancyOnly
-XX:+UseConcMarkSweepGC
-XX:+UseLargePages
-XX:+UseParNewGC
-XX:CMSFullGCsBeforeCompaction=1
-XX:CMSInitiatingOccupancyFraction=50
-XX:CMSMaxAbortablePrecleanTime=6000
-XX:CMSTriggerPermRatio=80
-XX:ConcGCThreads=8
-XX:MaxDirectMemorySize=26g
-XX:MaxTenuringThreshold=8
-XX:NewRatio=3
-XX:OnOutOfMemoryError=/opt/solr/bin/oom_solr.sh 9100 /opt/solr/server/logs
-XX:ParallelGCThreads=8
-XX:PretenureSizeThreshold=64m
-XX:SurvivorRatio=4
-XX:TargetSurvivorRatio=90
-Xloggc:/opt/solr/server/logs/solr_gc.log
-Xms8g
-Xmx16g
-Xss256k
-verbose:gc
-----------------------------------------

At the time of the OOM error, Xmx was set to 10g.  OS limits are, for the
most part, the 'factory' defaults of Scientific Linux 6.6.  I didn't see any
messages in the log about too many open files.  Thank you for the tips!
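For reference, the per-node limits can be checked with something along these
lines (typical Linux locations; exact values vary by install):

  ulimit -u                          # max user processes - native threads count against this
  ulimit -n                          # max open file descriptors
  cat /proc/sys/kernel/threads-max   # system-wide thread cap

Raising the per-user caps is usually a matter of nproc/nofile entries in
/etc/security/limits.conf (or limits.d) for the user running Solr.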

-Joe

On 5/31/2015 4:24 AM, Tomasz Borek wrote:
> Joseph,
>
> You are doing a memory-intensive and perhaps also an IO-intensive operation
> at once. That makes your C-heap run out of memory or hit a thread limit
> (hence the first problem, java.lang.OutOfMemoryError: unable to create new
> native thread), and later you also hit the problem of the Java heap being
> full - or, more precisely, of GC being unable to free enough space there,
> even with a collection, to allocate the new object you want allocated (hence
> the second throw: java.lang.OutOfMemoryError: Java heap space).
>
> What matters is:
> - whether your OS is 32-bit or 64-bit
> - whether your JVM is 32-bit or 64-bit
> - what your OS limits on thread creation are, and whether you have touched
> or changed them (1st problem)
> - how you start the JVM (Xms, Xmx, Xss, dumps, direct memory, permgen size -
> both problems)
>
> What you can do to solve your problems differs depending on what exactly
> causes them, but in general:
>
> NATIVE:
> 1) Either your operation spawns many threads and you hit your thread limit
> (the OS limits how many threads a process can create) - take a thread dump
> and see
> 2) Or your operation creates many threads and the memory settings you start
> the JVM with, plus the 32/64-bitness of the OS and JVM, leave the C-heap
> without enough memory, so you hit the OOM error - adjust the settings, move
> to 64-bit architectures, add RAM while on 64-bit (32-bit really chokes you
> down: less than 4GB is available for EVERYTHING - Java heap, PermGen space
> AND C-heap)
>
> Usually, with such a thread-greedy operation, it's also worth looking at the
> code to see whether thread creation/management can be optimized.
>
> OOM on Java heap:
> Add the heap-dump-on-OOM parameter to your JVM, then walk the dominator tree
> or look at the histogram to tell what is actually eating your heap space. MAT
> is a good tool for this; if your heap is something like 150GB, NetBeans may
> help instead, or see Alexey Ragozin's work - I think he forked the NetBeans
> heap analyzer and made some adjustments specifically for such cases. A quick
> Google search turns it up: http://blog.ragozin.info/
>
>
> pozdrawiam,
> LAFK

Re: Deleting Fields

Posted by Tomasz Borek <to...@gmail.com>.
Joseph,

You are doing a memory-intensive and perhaps also an IO-intensive operation
at once. That makes your C-heap run out of memory or hit a thread limit
(hence the first problem, java.lang.OutOfMemoryError: unable to create new
native thread), and later you also hit the problem of the Java heap being
full - or, more precisely, of GC being unable to free enough space there,
even with a collection, to allocate the new object you want allocated (hence
the second throw: java.lang.OutOfMemoryError: Java heap space).

What matters is:
- whether your OS is 32-bit or 64-bit
- whether your JVM is 32-bit or 64-bit
- what your OS limits on thread creation are, and whether you have touched
or changed them (1st problem)
- how you start the JVM (Xms, Xmx, Xss, dumps, direct memory, permgen size -
both problems)

What you can do to solve your problems differs depending on what exactly
causes them, but in general:

NATIVE:
1) Either your operation spawns many threads and you hit your thread limit
(the OS limits how many threads a process can create) - take a thread dump
and see
2) Or your operation creates many threads and the memory settings you start
the JVM with, plus the 32/64-bitness of the OS and JVM, leave the C-heap
without enough memory, so you hit the OOM error - adjust the settings, move
to 64-bit architectures, add RAM while on 64-bit (32-bit really chokes you
down: less than 4GB is available for EVERYTHING - Java heap, PermGen space
AND C-heap)

Usually, with such a thread-greedy operation, it's also worth looking at the
code to see whether thread creation/management can be optimized.

OOM on Java heap:
Add the heap-dump-on-OOM parameter to your JVM, then walk the dominator tree
or look at the histogram to tell what is actually eating your heap space. MAT
is a good tool for this; if your heap is something like 150GB, NetBeans may
help instead, or see Alexey Ragozin's work - I think he forked the NetBeans
heap analyzer and made some adjustments specifically for such cases. A quick
Google search turns it up: http://blog.ragozin.info/
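The flags in question look like this (the dump path is only an example; it
can be a file or a directory):

  -XX:+HeapDumpOnOutOfMemoryError
  -XX:HeapDumpPath=/opt/solr/server/logs/dumps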


pozdrawiam,
LAFK

2015-05-30 20:48 GMT+02:00 Erick Erickson <er...@gmail.com>:

> Faceting on very high cardinality fields can use up memory, no doubt
> about that. I think the entire delete question was a red herring, but
> you know that already ;)....
>
> So I think you can forget about the delete stuff. Although do note
> that if you do re-index your old documents, the new version won't have
> the field, and as segments are merged the deleted documents will have
> all their resources reclaimed, effectively deleting the field from the
> old docs.... So you could gradually re-index your corpus and get this
> stuff out of there.
>
> Best,
> Erick

Re: Deleting Fields

Posted by Erick Erickson <er...@gmail.com>.
Faceting on very high cardinality fields can use up memory, no doubt
about that. I think the entire delete question was a red herring, but
you know that already ;)....

So I think you can forget about the delete stuff. Although do note
that if you do re-index your old documents, the new version won't have
the field, and as segments are merged the deleted documents will have
all their resources reclaimed, effectively deleting the field from the
old docs.... So you could gradually re-index your corpus and get this
stuff out of there.
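One thing worth checking, since the first trace dies in
SimpleFacets.getFacetFieldCounts while trying to add worker threads: that
per-field facet executor is driven by the facet.threads parameter (0, the
default, keeps faceting in the request thread). Capping it - and facet.limit -
bounds both thread creation and response size. A sketch of such a request,
with a made-up field name:

  curl 'http://helios:9100/solr/UNCLASS/select?q=*:*&rows=0&facet=true&facet.field=some_field&facet.limit=100&facet.threads=0'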

Best,
Erick

On Sat, May 30, 2015 at 5:18 AM, Joseph Obernberger
<jo...@lovehorsepower.com> wrote:
> Thank you Erick.  I was thinking that it actually went through and removed
> the index data; thank you for the clarification.  What happened was I had
> some bad data that created a lot of fields (some 8000).  I was getting some
> errors adding new fields where Solr could not talk to ZooKeeper, and I
> thought it might be because there are so many fields.  The index size is
> some 420 million docs.
> I'm hesitant to try to re-create it because when the shards crash, they
> leave a write.lock file in HDFS, and I need to manually delete that file (on
> 27 machines) before bringing them back up.
> I believe this is the stack trace - but it looks to be related to facets,
> and I'm not 100% sure that this is the correct trace!  Sorry - if it
> happens again I will update.
> [...]

Re: Deleting Fields

Posted by Joseph Obernberger <jo...@lovehorsepower.com>.
Thank you Erick.  I was thinking that it actually went through and
removed the index data; thank you for the clarification.  What happened
was I had some bad data that created a lot of fields (some 8000).  I was
getting some errors adding new fields where Solr could not talk to
ZooKeeper, and I thought it might be because there are so many fields.
The index size is some 420 million docs.
I'm hesitant to try to re-create it because when the shards crash, they
leave a write.lock file in HDFS, and I need to manually delete that file
(on 27 machines) before bringing them back up.
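Something along these lines, run from a single machine, should clear them all
at once - the path is only a guess at the layout under solr.hdfs.home:

  hdfs dfs -rm '/solr/UNCLASS/core_node*/data/index/write.lock'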
I believe this is the stack trace - but it looks to be related to
facets, and I'm not 100% sure that this is the correct trace!  Sorry -
if it happens again I will update.

ERROR - 2015-05-29 20:39:34.707; [UNCLASS shard9 core_node14 UNCLASS] 
org.apache.solr.common.SolrException; null:java.lang.RuntimeException: 
java.lang.OutOfMemoryError: unable to create new native thread
         at 
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:854)
         at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:463)
         at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:220)
         at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
         at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
         at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
         at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
         at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
         at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
         at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
         at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
         at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
         at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
         at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
         at 
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
         at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
         at org.eclipse.jetty.server.Server.handle(Server.java:368)
         at 
org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
         at 
org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
         at 
org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:953)
         at 
org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1014)
         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:861)
         at 
org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
         at 
org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
         at 
org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
         at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
         at 
org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
         at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
         at java.lang.Thread.start0(Native Method)
         at java.lang.Thread.start(Thread.java:714)
         at 
java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:949)
         at 
java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1371)
         at 
org.apache.solr.request.SimpleFacets.getFacetFieldCounts(SimpleFacets.java:637)
         at 
org.apache.solr.request.SimpleFacets.getFacetCounts(SimpleFacets.java:280)
         at 
org.apache.solr.handler.component.FacetComponent.process(FacetComponent.java:106)
         at 
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:222)
         at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
         at org.apache.solr.core.SolrCore.execute(SolrCore.java:1984)
         at 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:829)
         at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:446)
         ... 26 more

Then later:

ERROR - 2015-05-29 21:57:22.370; [UNCLASS shard9 core_node14 UNCLASS] 
org.apache.solr.common.SolrException; null:java.lang.RuntimeException: 
java.lang.OutOfMemoryError: Java heap space
         at 
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:854)
         at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:463)
         at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:220)
         at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
         at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
         at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
         at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
         at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
         at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
         at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
         at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
         at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
         at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
         at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
         at 
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
         at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
         at org.eclipse.jetty.server.Server.handle(Server.java:368)
         at 
org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
         at 
org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
         at 
org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:953)
         at 
org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1014)
         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:861)
         at 
org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
         at 
org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
         at 
org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
         at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
         at 
org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
         at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: Java heap space



-Joe

On 5/30/2015 12:32 AM, Erick Erickson wrote:
> Yes, but deleting fields from the schema only means that _future_
> documents will throw an "undefined field" error. All the documents
> currently in the index will retain that field.
>
> Why you're hitting an OOM is a mystery, though. But deleting a field isn't
> removing the contents of indexed documents. Showing us the full stack
> when you hit an OOM would be helpful.
>
> Best,
> Erick


Re: Deleting Fields

Posted by Erick Erickson <er...@gmail.com>.
Yes, but deleting fields from the schema only means that _future_
documents will throw an "undefined field" error. All the documents
currently in the index will retain that field.

Why you're hitting an OOM is a mystery, though. But deleting a field isn't
removing the contents of indexed documents. Showing us the full stack
when you hit an OOM would be helpful.

Best,
Erick

On Fri, May 29, 2015 at 4:58 PM, Joseph Obernberger
<jo...@lovehorsepower.com> wrote:
> Thank you Shawn - I'm referring to fields in the schema.  With Solr 5, you
> can delete fields from the schema.
> https://cwiki.apache.org/confluence/display/solr/Schema+API#SchemaAPI-DeleteaField
>
> -Joe

Re: Deleting Fields

Posted by Joseph Obernberger <jo...@lovehorsepower.com>.
Thank you Shawn - I'm referring to fields in the schema.  With Solr 5, 
you can delete fields from the schema.
https://cwiki.apache.org/confluence/display/solr/Schema+API#SchemaAPI-DeleteaField
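The call is roughly this (collection and field name are placeholders):

  curl -X POST -H 'Content-type:application/json' --data-binary '{
    "delete-field" : { "name" : "badfield_1" }
  }' http://localhost:8983/solr/mycollection/schema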

-Joe

On 5/29/2015 7:30 PM, Shawn Heisey wrote:
> On 5/29/2015 5:08 PM, Joseph Obernberger wrote:
>> Hi All - I have a lot of fields to delete, but noticed that once I
>> started deleting them, I quickly ran out of heap space.  Is
>> delete-field a memory intensive operation?  Should I delete one field,
>> wait a while, then delete the next?
> I'm not aware of a way to delete a field.  I may have a different
> definition of what a field is than you do, though.
>
> Solr lets you delete entire documents, but deleting a field from the
> entire index would involve re-indexing every document in the index,
> excluding that field.
>
> Can you be more specific about exactly what you are doing, what you are
> seeing, and what you want to see instead?
>
> Also, please be aware of this:
>
> http://people.apache.org/~hossman/#threadhijack
>
> Thanks,
> Shawn
>
>


Re: Deleting Fields

Posted by Shawn Heisey <ap...@elyograg.org>.
On 5/29/2015 5:08 PM, Joseph Obernberger wrote:
> Hi All - I have a lot of fields to delete, but noticed that once I
> started deleting them, I quickly ran out of heap space.  Is
> delete-field a memory intensive operation?  Should I delete one field,
> wait a while, then delete the next?

I'm not aware of a way to delete a field.  I may have a different
definition of what a field is than you do, though.

Solr lets you delete entire documents, but deleting a field from the
entire index would involve re-indexing every document in the index,
excluding that field.
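If every field is stored, one per-document route is an atomic update that
sets the field to null - a rough sketch, with a made-up id, field name, and
collection:

  curl -X POST -H 'Content-type:application/json' --data-binary '[
    { "id" : "doc1", "badfield_1" : { "set" : null } }
  ]' 'http://localhost:8983/solr/mycollection/update?commit=true'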

Can you be more specific about exactly what you are doing, what you are
seeing, and what you want to see instead?

Also, please be aware of this:

http://people.apache.org/~hossman/#threadhijack

Thanks,
Shawn


Deleting Fields

Posted by Joseph Obernberger <jo...@lovehorsepower.com>.
Hi All - I have a lot of fields to delete, but noticed that once I 
started deleting them, I quickly ran out of heap space.  Is delete-field 
a memory intensive operation?  Should I delete one field, wait a while, 
then delete the next?
Thank you!

-Joe