Posted to users@activemq.apache.org by ocoolio <ad...@freemail.hu> on 2007/07/25 16:04:17 UTC

Performance problem - On empty queue sender gets blocked - on filled queue it's OK

Hello!

I have a bad performance problem:

I'm using NMS with .NET (C#) and two queues:
SUBMITQ
DESTQ

A server component listens on SUBMITQ. When a message is received, it transforms
it and submits it to DESTQ, all in one transaction
(AcknowledgementMode.Transactional with sess.Commit(), PrefetchSize=1).
This works great with well-filled queues (>10 messages pending), since the
receiver doesn't block the producer.
Performance in this (filled-queue) case: ~130 msg/sec on the consumer, and the
producer can send about 500 msg/sec.
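
In code, the loop looks roughly like this (a minimal sketch: Transform() is a
stand-in for my real transformation, the broker URI is illustrative, and the
destination option is just one way to request a prefetch of 1; the exact
mechanism depends on the NMS version):

    using Apache.NMS;
    using Apache.NMS.ActiveMQ;

    // stand-in for the actual message transformation (not shown)
    string Transform(string s) { return s; }

    IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
    using (IConnection connection = factory.CreateConnection())
    using (ISession session = connection.CreateSession(AcknowledgementMode.Transactional))
    {
        connection.Start();
        // PrefetchSize=1 requested via a destination option (illustrative)
        IQueue submitQ = session.GetQueue("SUBMITQ?consumer.prefetchSize=1");
        IQueue destQ = session.GetQueue("DESTQ");
        using (IMessageConsumer consumer = session.CreateConsumer(submitQ))
        using (IMessageProducer producer = session.CreateProducer(destQ))
        {
            // persistent delivery (the default, shown for clarity)
            producer.DeliveryMode = MsgDeliveryMode.Persistent;
            while (true)
            {
                ITextMessage inMsg = consumer.Receive() as ITextMessage;
                if (inMsg == null) continue;
                producer.Send(session.CreateTextMessage(Transform(inMsg.Text)));
                session.Commit(); // receive + send succeed or fail together
            }
        }
    }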

The problem comes when the consumer empties the queue. When that happens, the
producer becomes extremely slow (10 msg/sec), and so does the consumer. The
queue stays empty, so somehow the producer and consumer are waiting on each other.
When I pause the consumer for a second, the queue accumulates a lot of messages
and performance returns to normal: the producer sends at 500, the consumer
works at 130.

I use persistent messages, the journal, and PostgreSQL.
Because of .NET NMS I cannot do async sends on the client, but 500/sec is just
enough for me.

Please tell me what this is; I just cannot figure it out. I don't want to
implement wait cycles on the consumer just to keep the queue packed with
messages!
(ActiveMQ 5.0, 2007-07-10.)

Thanks,

Adam


SOLUTION Re: Performance problem - On empty queue sender gets blocked - on filled queue it's OK

Posted by ocoolio <ad...@freemail.hu>.

Hi everyone!

I've found out that it's a PostgreSQL concurrency problem: Postgres is just bad
at concurrent transactions that update the same row/page. Switching to DB2
Express solved it.
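
For anyone hitting the same thing: the switch only means pointing the broker's
persistence adapter at a different datasource in activemq.xml. Roughly like
this (the bean id, credentials, and DB2 URL are illustrative, not my exact
config):

    <broker xmlns="http://activemq.apache.org/schema/core">
      <persistenceAdapter>
        <!-- journal in front of JDBC, as in the original setup -->
        <journaledJDBC journalLogFiles="5"
                       dataDirectory="${activemq.base}/data"
                       dataSource="#db2-ds"/>
      </persistenceAdapter>
    </broker>

    <!-- illustrative DB2 Express datasource -->
    <bean id="db2-ds" class="org.apache.commons.dbcp.BasicDataSource"
          destroy-method="close">
      <property name="driverClassName" value="com.ibm.db2.jcc.DB2Driver"/>
      <property name="url" value="jdbc:db2://localhost:50000/AMQDB"/>
      <property name="username" value="activemq"/>
      <property name="password" value="activemq"/>
    </bean>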

Cheers.




Re: Performance problem - On empty queue sender gets blocked - on filled queue it's OK

Posted by ocoolio <ad...@freemail.hu>.
Hey!

Well, okay, I see no one likes my questions, so I've kept myself busy:
I've found out that it is basically a problem with Postgres! As soon as the
transaction commits, the ActiveMQ broker starts doing INSERTs on Postgres, and
these take forever. I think it might be a journal bug, since from what I've
read, the journal should push the changes to the database well after the
messages are received.
But back to Postgres: I think inserting into the same table (producer) while
updating the same table (consumer) is just a very hard task for Postgres, and
the concurrency kills performance.
I've tested with MSSQL 2005, but I get a strange StackOverflowException from
the broker after ~1000 messages. Performance, though, seemed fine (the producer
did not slow down)! I've tried MySQL as well, but its performance was awful,
worse than anything else I've tried with ActiveMQ. An uninstall solved that
one :)
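
To make the contention concrete: with the default JDBC schema, both sides work
against the same ACTIVEMQ_MSGS table, roughly like this (schematic; the real
column list is longer):

    -- producer commit: the broker inserts the persistent message
    INSERT INTO ACTIVEMQ_MSGS (ID, CONTAINER, EXPIRATION, MSG)
    VALUES (?, ?, ?, ?);

    -- consumer commit: the broker deletes the consumed message
    DELETE FROM ACTIVEMQ_MSGS WHERE ID = ?;

So a fast producer and a fast consumer are constantly inserting into and
deleting from the same hot table and its index pages.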

So, still no reply? :)

Thanks,

Adam




Re: Performance problem - On empty queue sender gets blocked - on filled queue it's OK

Posted by ocoolio <ad...@freemail.hu>.
Hello!

Just a short extra bit of info:
the producer's first transaction is fast, as usual! Then, after the commit, the
.Send() in the second transaction takes forever.
No matter whether I create a new factory, a new connection, or whatever, the
producer stays blocked. However, I can start a new instance (a new exe, i.e. a
new process), which can commit fast once and then slows down just like the
first process.
What does this have to do with processes?

What is happening? 
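
For what it's worth, here is roughly how I measure it (a minimal sketch; the
broker URI is illustrative):

    using System;
    using System.Diagnostics;
    using Apache.NMS;
    using Apache.NMS.ActiveMQ;

    IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
    using (IConnection connection = factory.CreateConnection())
    using (ISession session = connection.CreateSession(AcknowledgementMode.Transactional))
    using (IMessageProducer producer = session.CreateProducer(session.GetQueue("SUBMITQ")))
    {
        connection.Start();
        for (int i = 0; i < 10; i++)
        {
            Stopwatch sw = Stopwatch.StartNew();
            producer.Send(session.CreateTextMessage("test " + i));
            session.Commit();
            // First iteration is fast; from the second one on,
            // Send()/Commit() stalls for this process.
            Console.WriteLine("txn {0}: {1} ms", i, sw.ElapsedMilliseconds);
        }
    }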


Thanks,

Adam

