I have a complex business process to model on the system. As part of this, we are eliminating a lot of the manual tasks in the business process and only capturing the ones that are necessary for the system.
Since I am making this change, I was thinking about modelling the whole process as chunks of multiple processes. It would be great to understand whether I should make these individual chunks Sub Processes of the main process or model them as separate, individual Processes.
In contrast to Niall's approach, I would start with one, maybe huge, process model. With the Camunda Modeler it is very easy to navigate even huge process models, especially with the minimap.
I’ve seen people who got lost in nested processes (they modelled six nested levels with call activities).
I only start to split the process when I can really reuse a part of it.
We are in a state where we expect the process to be in a lot of flux, especially the parts that I am trying to separate out. Having them as separate entities localises the migration in case of a process change: rather than migrating all the tokens in the system, I would only have to migrate the instances of the process that the change affects.
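For illustration only, here is a minimal sketch of how that localised migration could look with the Camunda Platform 7 Java migration API, assuming the separated chunk is deployed as its own process definition. The process key `invoiceApproval` and the definition ids are placeholders, not from our actual model:

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.RuntimeService;
import org.camunda.bpm.engine.migration.MigrationPlan;

public class MigrateChildProcess {

    public static void main(String[] args) {
        ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();
        RuntimeService runtimeService = engine.getRuntimeService();

        // Hypothetical definition ids for two versions of the separated child
        // process only; the parent process and its running tokens are untouched.
        String sourceDefinitionId = "invoiceApproval:1:abc"; // old version
        String targetDefinitionId = "invoiceApproval:2:def"; // new version

        MigrationPlan plan = runtimeService
            .createMigrationPlan(sourceDefinitionId, targetDefinitionId)
            .mapEqualActivities() // map activities whose ids did not change
            .build();

        // Migrate only the instances of the changed child process definition.
        runtimeService.newMigration(plan)
            .processInstanceQuery(
                runtimeService.createProcessInstanceQuery()
                    .processDefinitionId(sourceDefinitionId))
            .execute();
    }
}
```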
I am evaluating the pros and cons of these approaches, and it would be helpful to have your opinion on them.
I like to follow the idea of having one orchestrator per pool (process).
If part of the process is owned and managed by its own authority, I put it in a separate process.
Otherwise you implicitly assume there is one authority responsible for everything, which is often not the case.
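As a sketch of that idea (my own example, assuming the Camunda Platform 7 Java API): with one orchestrator per pool, the processes of different authorities stay loosely coupled through messages rather than call activities. The process and message names below are hypothetical:

```java
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.RuntimeService;

public class CrossPoolHandoff {

    public static void main(String[] args) {
        RuntimeService runtimeService =
            ProcessEngines.getDefaultProcessEngine().getRuntimeService();

        // The purchasing authority's process hands off to the warehouse
        // authority's process via a message start event, so each pool keeps
        // its own owner, lifecycle and versioning.
        runtimeService.startProcessInstanceByMessage(
            "ShipmentRequested", // hypothetical message start event in the warehouse process
            "order-4711");       // shared business key to correlate the two pools
    }
}
```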
It was interesting to understand the concept of separation by ownership. In our case, we are mainly concerned about accommodating the flux in the process definition. The ownership problem is of less importance right now (though I understand it will become a big topic soon).
How do you suggest the division would help with managing change in the definition?
I would suggest you consider thinking of processes as a series of stages with steps within them. Then, when you are modelling, it is easy to follow the process and use call activities where you need to break pieces off logically or instantiate multiple runs of a flow. This is what I think @Niall is saying, and I would agree with it.
The other value of breaking the process down into sub-processes via Call Activities is simply readability. Reading through a thousand steps with dozens of branches is very difficult. To me, this is treating BPMN in the same manner as OOP in programming: creating massive methods/classes would be frowned upon, even in an application that is in flux. As a general standard, my team considers roughly 20-30 flow nodes the point to start thinking about breaking off a sub-process.
I’ve used these concepts for my current Camunda implementation and for PegaSystems applications. They seem to work for my teams.
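As a rough sketch of that kind of split (my own example, assuming the Camunda Platform 7 BPMN model fluent builder; the process keys are made up), the parent process delegates one reusable chunk to a separately deployed child process via a call activity:

```java
import java.io.File;

import org.camunda.bpm.model.bpmn.Bpmn;
import org.camunda.bpm.model.bpmn.BpmnModelInstance;

public class CallActivitySplit {

    public static void main(String[] args) {
        // Parent process with a call activity that references the child
        // process by its key; the child is deployed and versioned on its own.
        BpmnModelInstance parent = Bpmn.createExecutableProcess("orderFulfilment")
            .startEvent()
            .userTask("reviewOrder").name("Review order")
            .callActivity("shipGoods")
                .name("Ship goods")
                .calledElement("shippingProcess") // key of the reusable child process
            .endEvent()
            .done();

        // Write the generated model so it can be inspected in the Modeler.
        Bpmn.writeModelToFile(new File("orderFulfilment.bpmn"), parent);
    }
}
```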
Now I understand a good approach to take and the points to address. The Call Activity seems like an interesting approach. I'll start on it and share my feedback here.