Call activity multi-instance: invoking the same delegate twice

I have a sub workflow with async enabled on its start event. Once the instance starts processing, I see the very next delegate executing twice.

Is there any configuration I am missing?

Thanks,
Gowtham

Can you upload the model? It’s hard to get a complete picture without being able to see the technical details of the model.

Code (this delegate populates the list of data sources for the sub workflows that need to be called):
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.commons.collections4.CollectionUtils; // assumed; the commons-collections 3.x package also provides isNotEmpty
import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
// application-specific imports (LocationResult, Component, LocationProcessVariableUtil) omitted

@Service
public class LocationProcessServiceDelegate implements JavaDelegate {

    private static final Logger LOGGER = LoggerFactory.getLogger(LocationProcessServiceDelegate.class);

    // Maps a component type to the data source configured for it.
    @Value("#{${dataSource}}")
    private Map<String, String> dataSourceMap = new HashMap<>();

    private static final String application = "application:";

    @Value("${abc.application}")
    private String applicationName;

    @Autowired
    private LocationProcessVariableUtil locationProcessVariableUtil;

    @Override
    public void execute(DelegateExecution execution) throws Exception {

        LOGGER.info(application + applicationName + ", LocationProcessServiceDelegate activityId: "
                + execution.getCurrentActivityId() + ", activityName: " + execution.getCurrentActivityName()
                + ", processInstanceId: " + execution.getProcessInstanceId());

        LocationResult locationResult = locationProcessVariableUtil.initialiseLocationResult(execution);
        String workflowId = locationResult.getWorkflowId();
        List<String> dataSourceList = new ArrayList<>();

        // Collect one data source entry per component type that has a mapping configured.
        if (CollectionUtils.isNotEmpty(locationResult.getRequest().getComponents())) {
            for (Component component : locationResult.getRequest().getComponents()) {
                if (dataSourceMap.containsKey(component.getType().toUpperCase())) {
                    dataSourceList.add(dataSourceMap.get(component.getType().toUpperCase()));
                }
            }
        }

        LOGGER.debug(application + applicationName + ", LocationProcessServiceDelegate processInstanceId: "
                + execution.getProcessInstanceId() + ", workflowId: " + workflowId + ", dataSourceList: " + dataSourceList);

        // The multi-instance call activity iterates over this collection variable.
        execution.setVariable("dataSourceList", dataSourceList);
    }
}

Can you upload the model itself - it contains a lot of important information about how the process should execute.

diagram_1.bpmn (9.4 KB)

So the process is getting executed twice, but I see the same process instance id.
That means the sub workflow is invoked only once.

Does any retry time cycle need to be set?

So, you’ve got a multi-instance call activity, which means the sub process will be called n times. Are you saying that if the loop cardinality is n = 1 the subprocess still executes twice?

Yes. The behaviour observed is that the subprocess gets called once; when it is invoked, a job is created because async is enabled on the start event.
Once the job is executed, my service task runs but does not fail, it is still in progress, and I don’t see any exception.
Probably the engine thinks the job has failed and tries to execute it again.
Is there any default response time, i.e. must a job/delegate execution be completed within a specific time?
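
One way to check whether the second execution really is a retry of the same job is to watch the job’s retry counter and due date while the delegate is running. A minimal sketch, assuming access to the engine’s ManagementService (the class and method names below are placeholders):

import java.util.List;

import org.camunda.bpm.engine.ManagementService;
import org.camunda.bpm.engine.runtime.Job;

public class JobRetryInspector {

    // Lists the open jobs of a process instance together with their remaining retries.
    // If the retry count drops from 3 to 2 while the delegate is still running, the
    // engine has treated the first attempt as failed and scheduled a retry.
    public static void printJobs(ManagementService managementService, String processInstanceId) {
        List<Job> jobs = managementService.createJobQuery()
                .processInstanceId(processInstanceId)
                .list();
        for (Job job : jobs) {
            System.out.println("job " + job.getId()
                    + " retries=" + job.getRetries()
                    + " duedate=" + job.getDuedate());
        }
    }
}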

I read the following in one of the docs:
“In case a service call initiated by Camunda fails, a retry strategy will be used. By default a service task is retried three times.”
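
For reference, that default of three retries can be changed globally with a process engine plugin (a retry time cycle can also be set per activity in the BPMN model). A minimal sketch, assuming the Camunda Spring Boot starter picks up ProcessEnginePlugin beans; the value of 1 is only an example:

import org.camunda.bpm.engine.impl.cfg.AbstractProcessEnginePlugin;
import org.camunda.bpm.engine.impl.cfg.ProcessEngineConfigurationImpl;
import org.springframework.stereotype.Component;

@Component
public class RetryConfigurationPlugin extends AbstractProcessEnginePlugin {

    // Example only: create each job with a single retry instead of the default three,
    // so a job the engine considers failed is not executed a second time.
    @Override
    public void preInit(ProcessEngineConfigurationImpl configuration) {
        configuration.setDefaultNumberOfRetries(1);
    }
}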

Does the code take a long time to execute?
It’s possible that the job lock expires if it takes too long; the transaction would then roll back and the job would be tried again automatically. You could try to increase the job lock time.
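
A quick way to check this is to log how long the delegate actually runs and compare it with the configured job lock time. A minimal sketch (the class name is a placeholder; 300000 ms is the engine’s default lock time):

import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TimedDelegate implements JavaDelegate {

    private static final Logger LOGGER = LoggerFactory.getLogger(TimedDelegate.class);

    @Override
    public void execute(DelegateExecution execution) throws Exception {
        long start = System.currentTimeMillis();
        try {
            // ... the long-running service call goes here ...
        } finally {
            // If the elapsed time exceeds lock-time-in-millis (default 300000 ms),
            // another job executor thread can re-acquire and re-run the job.
            LOGGER.info("activity {} took {} ms", execution.getCurrentActivityId(),
                    System.currentTimeMillis() - start);
        }
    }
}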

Yes, it takes time.

Is this the one? The property camunda.bpm.job-execution.lock-time-in-millis: “Specifies the time in milliseconds an acquired job is locked for execution. During that time, no other job executor can acquire the job.” Default: 300000.

Can you help me with the right one?

job-execution:
  lock-time-in-millis: 900000
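
Note that in the Spring Boot starter this key has to sit under camunda.bpm, as in the full configuration further down. One way to verify that the new value is actually picked up is to read it back from the engine at runtime; a minimal sketch, assuming access to the ProcessEngine bean:

import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.impl.cfg.ProcessEngineConfigurationImpl;

public class LockTimeCheck {

    // Prints the lock time the job executor is really using, so a misplaced or
    // ignored YAML property is easy to spot.
    public static void printLockTime(ProcessEngine processEngine) {
        ProcessEngineConfigurationImpl configuration =
                (ProcessEngineConfigurationImpl) processEngine.getProcessEngineConfiguration();
        System.out.println("lock-time-in-millis = "
                + configuration.getJobExecutor().getLockTimeInMillis());
    }
}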

I increased the time, but there is still a duplicate job execution.

@Niall

camunda:
  bpm:
    id-generator: strong
    database:
      type: mysql
    history-level: full
    filter:
      create: All Tasks
    job-execution:
      deployment-aware: true
      enabled: true
      lock-time-in-millis: 900000
      core-pool-size: 6
      max-pool-size: 10
      queue-capacity: 10
      max-jobs-per-acquisition: 6

With the above properties, and with various combinations of them, I am still facing the issue. Any help?
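
In case it helps to narrow this down, the historic job log records every execution attempt of a job, including failures and the remaining retries, so it shows whether the second delegate run belongs to the same job (a retry or an expired lock) or to a second job. A minimal sketch, assuming history level full as configured above and access to the HistoryService bean:

import java.util.List;

import org.camunda.bpm.engine.HistoryService;
import org.camunda.bpm.engine.history.HistoricJobLog;

public class JobLogInspector {

    // Dumps every log entry written for the jobs of one process instance:
    // creation, successful execution, failures (with exception message) and deletions.
    public static void printJobLog(HistoryService historyService, String processInstanceId) {
        List<HistoricJobLog> logs = historyService.createHistoricJobLogQuery()
                .processInstanceId(processInstanceId)
                .orderByTimestamp().asc()
                .list();
        for (HistoricJobLog log : logs) {
            System.out.println(log.getTimestamp()
                    + " jobId=" + log.getJobId()
                    + " retries=" + log.getJobRetries()
                    + " failure=" + log.isFailureLog()
                    + " exception=" + log.getJobExceptionMessage());
        }
    }
}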