Posted to solr-user@lucene.apache.org by Steinar Asbjørnsen <st...@gmail.com> on 2009/10/01 16:23:08 UTC

Re: "Only one usage of each socket address" error

Hi.

This situation is still bugging me.
I thought I had it fixed yesterday, but no...

Seems like this goes both for deleting and adding, but I'll explain
the delete situation here:
When I'm deleting documents (~5k) from an index, I get an error message
saying
"Only one usage of each socket address (protocol/network address/port)
is normally permitted 127.0.0.1:8983".

I've tried both delete by id and delete by query, and both give me
the same error.
The commands giving me the error message are solr.Delete(id) and
solr.Delete(new SolrQuery("id:"+id)).

The commands are issued with SolrNet, and I'm not sure if this is
a SolrNet or Solr issue.
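
Roughly, the call pattern is the following (a simplified sketch: MyObject is
my mapped document type, and batching the ids into one delete-by-query with a
single commit is just an idea for cutting the number of HTTP requests, not
what the code does today):

  using System.Collections.Generic;
  using System.Linq;
  using SolrNet;

  // Sketch only: ids are assumed to be plain tokens that need no query escaping.
  static void DeleteByIds(ISolrOperations<MyObject> solr, IList<string> ids)
  {
      const int batchSize = 500;   // arbitrary batch size
      for (int i = 0; i < ids.Count; i += batchSize)
      {
          string[] batch = ids.Skip(i).Take(batchSize).ToArray();
          // One delete-by-query per batch instead of one HTTP round trip per id.
          solr.Delete(new SolrQuery("id:(" + string.Join(" OR ", batch) + ")"));
      }
      solr.Commit();               // single commit at the end
  }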

I cannot find anything that helps me out in the catalina log.
Are there any other logs that should be checked?

I'm grateful for any pointers :)

Thanks,
Steinar

On 29 Sep 2009, at 11:15, Steinar Asbjørnsen wrote:

> Seems like the post in the SolrNet group
> http://groups.google.com/group/solrnet/browse_thread/thread/7e3034b626d3e82d?pli=1
> helped me get through.
>
> Thank you solr-users for helping out too!
>
> Steinar
>
> Forwarded message:
>
>> From: Steinar Asbjørnsen <st...@gmail.com>
>> Date: 28 September 2009 17:07:15 GMT+02:00
>> To: solr-user@lucene.apache.org
>> Subject: Re: "Only one usage of each socket address" error
>>
>> I'm using the add(MyObject) command in a foreach loop to add
>> my objects to the index.
>>
>> In the catalina log I cannot see anything that helps me out.
>> It stops at:
>> 28.sep.2009 08:58:40  
>> org.apache.solr.update.processor.LogUpdateProcessor finish
>> INFO: {add=[12345]} 0 187
>> 28.sep.2009 08:58:40 org.apache.solr.core.SolrCore execute
>> INFO: [core2] webapp=/solr path=/update params={} status=0 QTime=187
>> Which indicates nothing is wrong.
>>
>> Are there any other logs that should be checked?
>>
>> What it seems like to me at the moment is that the foreach is
>> passing objects (documents) to Solr faster than Solr can add them to
>> the index. As in, I'm eventually running out of connections (to
>> Solr?) or something.
>>
>> I'm running another incremental update with other objects
>> where the foreach isn't quite as fast. That job has added over
>> 100k documents without failing and is still going, whereas the
>> problematic job fails after ~3k.
>>
>> What I've learned through the day, though, is that the index where my
>> feed is failing is actually redundant.
>> I.e. I'm off the hook for now.
>>
>> Still, I'd like to figure out what's going wrong.
>>
>> Steinar
>>
>>> There's nothing in that output that indicates something we can  
>>> help with over in solr-user land.  What is the call you're making  
>>> to Solr?  Did Solr log anything anomalous?
>>>
>>> 	Erik
>>>
>>>
>>> On Sep 28, 2009, at 4:41 AM, Steinar Asbjørnsen wrote:
>>>
>>>> I just posted to the SolrNet group since I have the exact same(?)
>>>> problem.
>>>> Hope I'm not being rude posting here as well (since the SolrNet
>>>> group doesn't seem as active as this mailing list).
>>>>
>>>> The problem occurs when I'm running an incremental feed (self-made)
>>>> of an index.
>>>>
>>>> My post:
>>>> [snip]
>>>> What's happening is that I get this error message (in VS):
>>>> "A first chance exception of type
>>>> 'SolrNet.Exceptions.SolrConnectionException' occurred in  
>>>> SolrNet.DLL"
>>>> And the web browser (which I use to start the feed) says:
>>>> "System.Data.SqlClient.SqlException: Timeout expired.  The timeout
>>>> period elapsed prior to completion of the operation or the server  
>>>> is
>>>> not responding."
>>>> At the time of writing my index contains 15k docs, and "lacks"  
>>>> ~700k
>>>> docs that the incremental feed should take care of adding to the
>>>> index.
>>>> The error message appears after 3k docs are added, and before 4k
>>>> docs are added.
>>>> I'm committing every 1000 docs (i % 1000 == 0).
>>>> In addition, autocommit is set to:
>>>> <autoCommit>
>>>> <maxDocs>10000</maxDocs>
>>>> </autoCommit>
>>>> More info:
>>>> From schema.xml:
>>>> <field name="id" type="text" indexed="true" stored="true"
>>>> required="true" />
>>>> <field name="name" type="string" indexed="true" stored="true"
>>>> required="false" />
>>>> I'm fetching data from a (remote) SQL Server 2008 instance, using
>>>> sqljdbc4.jar.
>>>> And Solr is running on a local Tomcat-installation.
>>>> SolrNet version: 0.2.3.0
>>>> Solr Specification Version: 1.3.0.2009.08.29.08.05.39
>>>>
>>>> [/snip]
>>>> Any suggestions on how to fix this would be much appreciated.
>>>>
>>>> Regards,
>>>> Steinar
>>>
>>
>
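
For reference, the add side of the feed quoted above boils down to something
like this (a simplified sketch: MyObject is the mapped type and the pending
collection stands in for the SQL Server read, so both are placeholders):

  using System.Collections.Generic;
  using SolrNet;

  static void RunFeed(ISolrOperations<MyObject> solr, IEnumerable<MyObject> pending)
  {
      int count = 0;
      foreach (MyObject doc in pending)
      {
          solr.Add(doc);          // each Add call is a separate HTTP request
          if (++count % 1000 == 0)
              solr.Commit();      // commit every 1000 docs, as described above
      }
      solr.Commit();              // flush whatever is left at the end
  }

Since every Add (and Delete) is its own HTTP request, a fast loop like this can
leave thousands of client sockets in TIME_WAIT on Windows, which is what the
"Only one usage of each socket address" error usually points to.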


Re: "Only one usage of each socket address" error

Posted by Steinar Asbjørnsen <st...@gmail.com>.
You're the man, Mauricio!

Adding MaxUserPort and TCPTimedWaitDelay to the registry and setting them
sure helps!
Over the weekend I'll look into doing this programmatically.
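
Probably something along these lines from .NET (untested sketch: it needs to
run elevated, the values below are just examples rather than recommendations,
and as far as I know a reboot is needed before they take effect):

  using Microsoft.Win32;

  class TcpTuning
  {
      static void Main()
      {
          // Raise the ephemeral port ceiling and shorten how long closed
          // sockets linger in TIME_WAIT.
          using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
              @"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters", true))
          {
              key.SetValue("MaxUserPort", 65534, RegistryValueKind.DWord);
              key.SetValue("TcpTimedWaitDelay", 30, RegistryValueKind.DWord);
          }
      }
  }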

Thanks!
Steinar

On 2 Oct 2009, at 14:47, Mauricio Scheffer wrote:

> Did you try this?
> http://blogs.msdn.com/dgorti/archive/2005/09/18/470766.aspx
> Also, please post the full exception stack trace.


Re: "Only one usage of each socket address" error

Posted by Mauricio Scheffer <ma...@gmail.com>.
Did you try this?
http://blogs.msdn.com/dgorti/archive/2005/09/18/470766.aspx
Also, please post the full exception stack trace.
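
Something like this around the failing call should capture the whole thing,
inner exceptions included (a sketch only; it assumes the usual
ISolrOperations<MyObject> instance and reuses the exception type named
earlier in the thread):

  using System;
  using SolrNet;
  using SolrNet.Exceptions;

  static void DeleteWithLogging(ISolrOperations<MyObject> solr, string id)
  {
      try
      {
          solr.Delete(new SolrQuery("id:" + id));
      }
      catch (SolrConnectionException ex)
      {
          // ToString() prints the message, any inner exception
          // (e.g. the underlying WebException) and the stack trace.
          Console.Error.WriteLine(ex.ToString());
          throw;
      }
  }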

2009/10/2 Steinar Asbjørnsen <st...@gmail.com>

> Tried running Solr on Jetty now, and I still get the same error :(.
>
> Steinar

Re: "Only one usage of each socket address" error

Posted by Steinar Asbjørnsen <st...@gmail.com>.
Tried running Solr on Jetty now, and I still get the same error :(.

Steinar
