Execute the Same Workflow in Parallel

Hi, I am new to BPMN workflows and trying to understand how the process engine executes the same workflow (same BPMN definition) in parallel.
Here is my use case:
We have a platform which publishes data to a BPMN definition using process variables; below is a snippet of how we send data to the engine.

public String initiateProcess(WorkflowRequest workflowRequest) {
    Map<String, Object> variables = new HashMap<>();
    variables.put("request", workflowRequest);
    ProcessInstance processInstance = processEngine.getRuntimeService()
            .startProcessInstanceByKey(processId, variables);
    return processInstance.getId();
}

Where processId is the BPMN definition key.
Now, within the BPMN workflow/definition file, I want to read these variables, process them, and then publish the processed data to other external systems via REST APIs.

The problem is that the initiateProcess method is triggered very frequently within the application, many times, with different workflowRequest data. In other words, we want to publish different sets of data to the same external system via the workflow many times in quick succession.

If I run the above logic, sometimes the variable value is null in the BPMN service tasks (even though the variable value is not null in the initiateProcess method), and sometimes we get an error saying the variable “request” already exists. I think parallel execution of the same workflow/BPMN process definition is failing. Please help me out on how to execute the same workflow/BPMN process definition in parallel. I don’t want to run service tasks in parallel; I want to run the entire process definition (BPMN file) in parallel.

I am really looking for an answer on this; can anyone reply to this question? Kindly let me know if you need any further details or if the question is not clear. Thanks in advance.

Process definitions know nothing about one another when they are started. They are completely independent, so I don’t think the issue here has anything to do with the engine. Can you give more information about your setup? Camunda version, database, clustered setup?

Hi Niall, Thank you for responding to my request.

We are using the Camunda BPMN engine APIs; below are the details of the artifacts which we configured in our Spring Boot application.


Below is the configuration of the datasource bean for the Camunda engine:

	public DataSource dataSource() {
		// Data source for the in-memory H2 database.
		SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
		return dataSource;
	}

	// @EventListener(ApplicationReadyEvent.class)
	public SpringProcessEngineConfiguration processEngineConfiguration() throws IOException {
		SpringProcessEngineConfiguration config = new SpringProcessEngineConfiguration();
		try {
			logger.info("processEngineConfiguration is loaded");
			Resource[] resources = resourceLoader.getResources("file:" + System.getenv("WORKFLOW_PATH") + "/*.bpmn");
		} catch (RuntimeException e) {
			return null;
		}
		return config;
	}

The above configuration is loaded at application startup by importing its class as follows:


We don’t have any clustered architecture; it’s a Spring Boot application.

We have service tasks in the workflow with Groovy scripts embedded in them. Below is a sample snippet of one service task:

 <serviceTask id="ReadEvent" name="Read Event" activiti:class="com.org.WorkflowServiceExecutor">
        <activiti:field name="scriptExecutor"></activiti:field>
        <activiti:field name="script">
		import java.util.HashMap;
		import java.util.Iterator;
		import java.util.Map;
		Calendar time = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
		long timestamp = time.getTimeInMillis() / 1000;
		WorkflowRequest workflowRequest = (WorkflowRequest) execution.getVariable("request");
        </activiti:field>
        <activiti:in sourceExpression="${Event.getEventType()}" target="eventType"></activiti:in>
 </serviceTask>

Below is the service class which loads and executes the above Groovy script.

public class WorkflowServiceExecutor implements JavaDelegate {
    public void execute(DelegateExecution execution) throws Exception {
        try {
            ScriptingEngines scriptingEngines = (ScriptingEngines) scriptExecutorExpression.getValue(execution);
            // scriptingEngines.setScriptBindingsFactory(scriptBindingsFactory);
            ScriptEngine scriptEngineForLanguage = scriptingEngines.getScriptEngineForLanguage("groovy");
            scriptEngineForLanguage.put("execution", execution);
        } catch (NullPointerException ne) {
            // ignored
        } catch (RuntimeException | ScriptException e) {
            logger.info("Process Engine exception in delegate");
        }
    }
}

What version of Camunda are you using? Also, did you create the model using the Camunda Modeler?
How are you deploying the processes? Do you simply have them in the resources directory?

Camunda Version: 7.10.0

Camunda Modeler: Eclipse Camunda Modeler Plugin(latest version fetched from http://camunda.org/release/camunda-eclipse-plugin/update-sites/kepler/latest/site/ )

In our case, whenever a new BPMN file is uploaded to our Spring Boot application by a user, we store the BPMN file in an external directory and load it for deployment using the lines of code below. So deployment happens only once, i.e., at the time of file upload to our application.

  public void deployProcess(ProcessEngine processEngine, String processName, String fileLocation)
            throws FileNotFoundException, GlobalException {
        try {
            logger.info("Initiating workflow deployment: deployProcess");
            DeploymentBuilder deploymentBuilder = processEngine.getRepositoryService().createDeployment();
            Deployment deployment = deploymentBuilder.name(processName)
                    .addInputStream(processName + ".bpmn", new FileInputStream(new File(fileLocation)))
                    .deploy();
            deployementProcess.put(processName, deployment.getId());
        } catch (RuntimeException e) {
            // handle/log deployment failure
        }
    }

Firstly, you’re using a deprecated tool for modeling. Make sure your models work with the latest version of the standalone modeler: https://camunda.com/download/modeler/

If you’re deploying this way, then you should change




Otherwise you might have problems when restarting Spring Boot.

There could be a lot of problems created by using the wrong modeler, so let me know if any issues arise from loading your model into the new modeler.

Ok, let me try the suggested things, and will update you soon. Thanks for helping me out.

Hi Niall, I changed the modeler used for modeling our workflow and observed the differences in XML namespaces and tags; below is a sample snippet of the changes.

<bpmn:serviceTask id="ConsumerEvent" name="Consume Event" camunda:class="com.org.WorkflowServiceExecutor">
        <camunda:field name="scriptExecutor"></camunda:field>
        <camunda:field name="script">
          <camunda:string>import java.util.HashMap;
		Calendar time = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
		long timestamp = time.getTimeInMillis() / 1000;
		WorkflowRequest workflowRequest = (WorkflowRequest) execution.getVariable("request");
		Map<String, Object> eventMap = (Map<String, Object>) workflowRequest.getAttribute("event");
		execution.setVariable("alertEvent", eventMap);</camunda:string>
        </camunda:field>
</bpmn:serviceTask>

And as you suggested, we also applied the change in


and restarted the application, but it is still giving the same error.
The workflow works fine if a single request is sent to it. But if we trigger the same workflow multiple times in quick succession, a few requests get processed and a few throw errors about variables being null or already existing.

javax.script.ScriptException: javax.script.ScriptException: org.camunda.bpm.engine.ProcessEngineException: ENGINE-17004 Cannot add variable instance with name alertEvent. Variable already exists
[ERROR] 2019-10-07 02:13:17.032 [Thread-1547] context - ENGINE-16006 BPMN Stack Trace:
        checkEventType (activity-leave, ProcessInstance[9052])
        checkEventType, name=checkEventType
        ReadEvent, name=Read Event

FYI, the initiateProcess method is invoked using a thread model as below:

public void publish(Event alertEvent) {
    new Thread(new Runnable() {
        public void run() {
            String instanceID = initiateProcess(workflowRequest);
        }
    }).start();
}

Kindly let me know if the above logic could be a reason for breaking the workflow execution.
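For reference, the same publish path could also be sketched with a bounded pool instead of one raw Thread per call. This is only an illustrative, self-contained sketch; initiateProcess here is a stub standing in for the real method that starts a process instance:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PublishPool {
    // bounded pool instead of an unbounded number of raw threads
    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    final Set<String> started = ConcurrentHashMap.newKeySet(); // demo bookkeeping only

    String initiateProcess(String workflowRequest) {
        return "instance-" + workflowRequest;                  // stub for illustration
    }

    public void publish(String alertEvent) {
        pool.submit(() -> started.add(initiateProcess(alertEvent)));
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws Exception {
        PublishPool p = new PublishPool();
        for (int i = 0; i < 50; i++) p.publish("event-" + i);
        p.shutdown();
        System.out.println(p.started.size());
    }
}
```

This only bounds concurrency; it does not by itself explain the null/duplicate variable errors.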

Any other suggestions to make this use case possible?

Hello Team,

I tried many other ways to see if this works, but no luck.
It looks like the Camunda engine cannot execute the same workflow definition multiple times in quick succession.

Sometimes I also see the exceptions below:

Cause: org.apache.ibatis.executor.BatchExecutorException: org.camunda.bpm.engine.impl.persistence.entity.VariableInstanceEntity.insertVariableInstance (batch index #3) failed. 2 prior sub executor(s) completed successfully, but will be rolled back. Cause: org.h2.jdbc.JdbcBatchUpdateException: Referential integrity constraint violation: "ACT_FK_VAR_BYTEARRAY: PUBLIC.ACT_RU_VARIABLE FOREIGN KEY(BYTEARRAY_ID_) REFERENCES PUBLIC.ACT_GE_BYTEARRAY(ID_) ('45')"; SQL statement:

And sometimes I see this other exception as well:

[ERROR] 2019-10-11 01:09:57.261 [Thread-18] context - ENGINE-16004 Exception while closing command context: ENGINE-02004 No outgoing sequence flow for the element with id 'checkEventType' could be selected for continuing the process.

checkEventType is a conditional block which always has at least a default value, so there is no chance of an empty value for this condition, but the error says there is no outgoing sequence flow!
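For reference, a gateway with a default flow is usually declared roughly like this in BPMN XML (the ids and targets here are illustrative, not the exact contents of our file):

```xml
<bpmn:exclusiveGateway id="checkEventType" name="checkEventType" default="Flow_default" />
<bpmn:sequenceFlow id="Flow_consume" sourceRef="checkEventType" targetRef="ConsumerEvent">
  <bpmn:conditionExpression xsi:type="bpmn:tFormalExpression">${eventType == 'ConsumeEvent'}</bpmn:conditionExpression>
</bpmn:sequenceFlow>
<!-- the default flow is taken when no condition matches -->
<bpmn:sequenceFlow id="Flow_default" sourceRef="checkEventType" targetRef="DefaultHandler" />
```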

Above errors are all intermittent in behavior.

I’m very confused by the main issue you’re having: there really shouldn’t be any issue with starting multiple instances of the same process definition on a regularly configured system. So that might not be the underlying problem.

How many instances are you trying to start per second?
Can you please upload your model?

CamundaExtensionWorkflow.bpmn (14.2 KB)

Attached is the base workflow which we are trying to execute.

I believe it creates at least 3 instances per second.

FYI, we are just using the H2 in-memory database for the Camunda engine; I hope we don’t need to change this.

Why are you using scripts as part of the field injection?

I’m sorry, I’m not getting your question, but we are using the script field to hold the business logic, written in Groovy, and we are able to load the values of that field in the WorkflowServiceExecutor class (which I have already posted above) and execute the Groovy script.

<bpmn:serviceTask id="ReadEvent" name="Read Event" camunda:class="com.org.WorkflowServiceExecutor">
       <camunda:field name="scriptExecutor"></camunda:field>
       <camunda:field name="script">
         <camunda:string>import java.util.HashMap;
		import java.util.Iterator;
		import java.util.Map;
		import java.text.SimpleDateFormat;
		import java.util.Date;
		import java.io.*;
		import java.io.FileInputStream;
		import java.util.*;
		import java.util.Scanner;
		import org.camunda.bpm.engine.delegate.DelegateExecution;
		import com.google.gson.Gson;
		import java.lang.reflect.Type;
		import com.google.gson.reflect.TypeToken;
		import org.springframework.http.HttpEntity;
		import org.springframework.http.HttpHeaders;
		import org.springframework.http.HttpStatus;
		import org.springframework.http.MediaType;
		import org.springframework.http.ResponseEntity;
		import org.springframework.http.converter.HttpMessageConverter;
		import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;
		import org.springframework.web.client.RestTemplate;
		import org.slf4j.Logger;
		import org.slf4j.LoggerFactory;

		Logger logger = LoggerFactory.getLogger("ExtensionWorkflow");
		Calendar time = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
		long timestamp = time.getTimeInMillis() / 1000;
		WorkflowRequest workflowRequest = (WorkflowRequest) execution.getVariable("request");
		AlertEvent alertEvents = (AlertEvent) workflowRequest.getAttribute("alertEventRequest");
		Gson gson = new Gson();
		String requestJson = gson.toJson(alertEvents);
		logger.info("Request  " + alertEvents.getEventType() + " " + alertEvents.getEventID());
		} else if (alertEvents.getEventType().equalsIgnoreCase("DeleteBox")) {
		} else if (alertEvents.getEventType().equalsIgnoreCase("ConsumeData")) {
         </camunda:string>
       </camunda:field>
</bpmn:serviceTask>

This is what we placed in the script field.

Using field injection is not the way to do this.
You should add it as a listener or as a script task. That could be causing your problems.
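For example, the same Groovy logic could live in a native script task instead of being injected into a delegate. This is a minimal sketch; the id, name, and variable names are taken from the snippets above, everything else is illustrative:

```xml
<bpmn:scriptTask id="ReadEvent" name="Read Event" scriptFormat="groovy">
  <bpmn:script><![CDATA[
    // "execution" is provided by the engine for the current process instance
    def workflowRequest = execution.getVariable("request")
    def eventMap = workflowRequest.getAttribute("event")
    execution.setVariable("alertEvent", eventMap)
  ]]></bpmn:script>
</bpmn:scriptTask>
```

With a script task the engine manages script evaluation itself, so there is no hand-rolled script-engine handling to get wrong.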

JavaDelegate will execute the fields, right? I mean, a service task can internally use this interface and execute the scripts/expressions from the injected fields, right?

Well, I changed the workflow from using a service task to a script task, but it behaves in the same manner.

This time I observed that one process instance is getting executed repeatedly.

For example, if I invoke the initiateProcess method with workflowRequest as the input parameter and set one of its fields, say workflowRequest.eventId = 400, then when I print this eventId in the workflow, 400 should appear only once. But I see this line printed with the same 400 ID 3 to 4 times.

[INFO ] 2019-10-11 05:11:56.043 [Thread-1033] ExtensionWorkflow - Request Data ConsumeEvent **400** 
[INFO ] 2019-10-11 05:11:56.044 [Thread-1035] ExtensionWorkflow - Request Data ConsumeEvent **400** 
[INFO ] 2019-10-11 05:11:56.045 [Thread-1036] ExtensionWorkflow - Request Data ConsumeEvent **400** 

And it skips some other events passed from our application to the workflow; for instance, I can see a log line printed with eventId 401 in the initiateProcess method, but the same never appears in the workflow log lines.

Not only the input data: when I print the process instance id in the workflow, it is also printed repeatedly with the same processInstanceID. I think this shouldn’t be the case.


Please attach your current BPMN XML so we can better understand the scenario we are talking about. Even better if you can share a sample project on GitHub that reproduces the problem.

In your original Java Delegate, you have this code:

ScriptEngine scriptEngineForLanguage = scriptingEngines.getScriptEngineForLanguage("groovy");
scriptEngineForLanguage.put("execution", execution);

This code is not thread-safe. The script engine is cached, so every execution of the service task uses the same script engine. You constantly overwrite a global variable of that scripting engine and then evaluate the actual script. It is likely that you experience race conditions.
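The usual fix is to give every evaluation its own bindings and pass them to eval(), instead of mutating the cached engine's global scope. Below is a minimal, self-contained sketch of the idea using only plain JDK types; in a real delegate you would call engine.eval(script, bindings) on the cached Groovy engine rather than the stand-in lambda body shown here:

```java
import javax.script.Bindings;
import javax.script.SimpleBindings;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BindingsIsolationDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(8);
        List<Future<Object>> seen = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            final Integer execution = i;       // stand-in for DelegateExecution
            seen.add(pool.submit(() -> {
                // per-invocation bindings: visible only to this evaluation,
                // unlike scriptEngineForLanguage.put("execution", ...) which
                // overwrites shared engine state
                Bindings bindings = new SimpleBindings();
                bindings.put("execution", execution);
                // a real delegate would now do: engine.eval(script, bindings);
                return bindings.get("execution");
            }));
        }
        for (int i = 0; i < 100; i++) {
            if (!seen.get(i).get().equals(i)) throw new AssertionError("raced at " + i);
        }
        pool.shutdown();
        System.out.println("each evaluation saw its own execution");
    }
}
```

Because each call evaluates against its own Bindings, concurrent process instances can no longer overwrite one another's "execution" variable.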