Limit concurrent requests when using the REST connector

Hi all,
I’m currently using the REST connector to send messages to my services.
However, I’m concerned about overwhelming the services with a large number of requests at once. Is there a way to limit the number of concurrent requests sent out when using the REST connector?
Thanks in advance.

Hello!

Dude… I believe you can implement something like a queue for your HTTP requests, or check whether your HTTP library has an option to limit the number of open connections (a connection pool).

I know there are libraries that offer this.
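
Something like this, sketched in Java with a fixed-size thread pool acting as the queue (the endpoint URL and the limit of 10 are just placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class BoundedRequestQueue {

    // At most 10 requests are in flight at any time; the rest wait in the pool's queue.
    private static final ExecutorService POOL = Executors.newFixedThreadPool(10);
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static void main(String[] args) {
        List<String> payloads = List.of("msg-1", "msg-2", "msg-3");
        for (String payload : payloads) {
            POOL.submit(() -> send(payload));
        }
        POOL.shutdown();
    }

    private static void send(String payload) {
        try {
            // Hypothetical endpoint; replace with your service URL.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.org/api/messages"))
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();
            HttpResponse<String> response =
                    CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(payload + " -> " + response.statusCode());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```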

Hope this helps!

Regards.
William Robert Alves

First question I would ask is:
SaaS or Self-Hosted?

For SaaS, I would think it's unlikely. You might be able to design a BPMN model that implements a single-threaded (or N-threaded) queue, but that would likely get expensive.

For Self-Hosted, I would say "possible, but not 100% sure on the how", since the connectors are a special subtype of service task. You can set the maximum number of jobs to pick up to 1, which effectively makes the worker single-threaded.
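
To sketch that idea with the Zeebe Java client (note: this replaces the out-of-the-box connector runtime's worker, so you would have to make the HTTP call yourself inside the handler; the job type string and gateway address below are assumptions):

```java
import io.camunda.zeebe.client.ZeebeClient;

public class SingleThreadedRestWorker {

    public static void main(String[] args) {
        // Assumed gateway address for a local Self-Managed cluster.
        try (ZeebeClient client = ZeebeClient.newClientBuilder()
                .gatewayAddress("localhost:26500")
                .usePlaintext()
                .build()) {

            client.newWorker()
                    .jobType("io.camunda:http-json:1") // job type of the REST connector (assumption)
                    .handler((jobClient, job) -> {
                        // ... call the downstream service here, then complete the job
                        jobClient.newCompleteCommand(job.getKey()).send().join();
                    })
                    .maxJobsActive(1) // pick up at most one job at a time
                    .open();

            // Keep the worker alive; a real application would run this as a long-lived service.
            Thread.sleep(60_000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```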

Thank you, @WilliamR.Alves , for your suggestion.
My concern with applying the limit on the service side is that it may cause many error messages in Camunda. For example, if my service only allows 10 concurrent requests but there are 15 messages in Camunda, then 5 of them will fail due to timeouts or rejections.

Regards,
Larry

You have to limit it on the client side, not the server side. In my case I use SemaphoreMap (a Node.js package) with a fixed limit: the client sends N requests, and as soon as one of them gets a response, the next request goes out.
I don't know how to do this with the REST connector, because I use my own code as the REST client.
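
For reference, the same idea sketched in Java with java.util.concurrent.Semaphore (the limit of 10, the URL, and the request shape are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Semaphore;

public class ThrottledClient {

    // Allow at most 10 requests in flight; callers block until a permit frees up.
    private static final Semaphore PERMITS = new Semaphore(10);
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static CompletableFuture<HttpResponse<String>> send(String url, String body)
            throws InterruptedException {
        PERMITS.acquire(); // wait for a free slot before sending
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        return CLIENT.sendAsync(request, HttpResponse.BodyHandlers.ofString())
                // Release the permit when the response (or an error) comes back.
                .whenComplete((response, error) -> PERMITS.release());
    }
}
```

With this, no more than 10 requests are ever open at once, which matches the service-side capacity Larry described above.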