I have a process that uses a timer event. The timer event is of type Duration with a wait time of 30 minutes (PT30M). This timer event is expected to fire exactly after 30 minutes, but it takes an additional 15 or 30 seconds. The delay is exactly 15 or 30 seconds every time, even if I change (increase or decrease) the duration of the timer event. I would like to know why it takes 15 or 30 seconds longer than configured to execute.
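For reference, this is a sketch of how such a timer is typically declared in BPMN 2.0 XML. The element names follow the BPMN 2.0 standard; the event `id` and surrounding process are hypothetical placeholders, not taken from the actual model:

```xml
<!-- Hypothetical intermediate timer catch event with a 30-minute
     ISO 8601 duration, as described in the question -->
<bpmn:intermediateCatchEvent id="WaitThirtyMinutes">
  <bpmn:timerEventDefinition>
    <bpmn:timeDuration xsi:type="bpmn:tFormalExpression">PT30M</bpmn:timeDuration>
  </bpmn:timerEventDefinition>
</bpmn:intermediateCatchEvent>
```

Note that PT30M only defines the earliest point at which the timer is *due*; the exact firing moment also depends on how often the engine's job executor polls for due jobs, which may explain a fixed extra delay.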
This has been discussed in an older thread; if you don't find the answer there, feel free to ask again here.