Posted to solr-user@lucene.apache.org by "Chaushu, Shani" <sh...@intel.com> on 2016/07/26 17:53:47 UTC

Solr index is locked after being killed - HDFS

Hi,
I've hit an issue that appears to be a known bug that was never really resolved:
https://issues.apache.org/jira/browse/SOLR-8335

I am working with Solr 6.1 on HDFS, but I see the problem also exists in Solr 5.x.
I started Solr, and it was killed because of an out-of-memory exception.
Now when I start it again, I get errors like:

org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Index dir 'hdfs://XXX/' of core yyy  is already locked. The most likely cause is another Solr server (or another solr core in this server) also configured to use this directory; other possible causes may be specific to lockType: hdfs

Looking at the index files on HDFS, I can see that write.lock still exists even after the process was killed.
I have multiple servers, multiple collections, and many processes working against Solr, and when Solr is killed I want it to come back up without errors.

Is there any way to free the locks after Solr is killed? Or is there another kind of lock that works with HDFS?
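For what it's worth, the manual workaround I've seen discussed is to delete the stale write.lock files before restarting, assuming no live Solr process actually holds the index (deleting a lock owned by a running server can corrupt the index). A hedged sketch; the HDFS path is the placeholder from the error message above, and the helper function name and local data-dir layout are my own assumptions for illustration:

```shell
# Stale-lock cleanup sketch. ASSUMPTION: every Solr process using these
# index dirs is down -- removing a lock held by a live server is unsafe.
#
# On HDFS (placeholder path; substitute your real index directories):
#   hdfs dfs -rm 'hdfs://XXX/data/index/write.lock'
#
# The same idea against a local Solr data directory, for illustration:
clean_stale_locks() {
  # $1: root of the Solr data directory to scan
  find "$1" -type f -name write.lock -print -delete
}
```

Something like `clean_stale_locks /var/solr/data` could then run before `bin/solr start`, but it would need to be guarded so it never fires while another Solr instance is up.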

Thanks a lot,
Shani


---------------------------------------------------------------------
Intel Electronics Ltd.

This e-mail and any attachments may contain confidential material for
the sole use of the intended recipient(s). Any review or distribution
by others is strictly prohibited. If you are not the intended
recipient, please contact the sender and delete all copies.