In deciding which architecture to go for, we need some guidance.
Some deciding factors:
we are a Kotlin + Spring Boot shop
we would like to use the Java API as much as possible
we want to keep the workflow engine separate from our application
We tried running Camunda as a separate app and communicating with it via the REST API, but the REST API is not fun to use; in particular, the serialization and deserialization is something we don't like.
Looking at the JSON format an object needs to be in (Variables in the REST API | docs.camunda.org), it seems a bit non-standard, so do we have to write our own serialization and deserialization module? That's why it would have been nice if we could just use the Java Fluent API and have it all done in the background for us.
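To illustrate what feels non-standard: the Camunda 7 REST API wraps every variable in an envelope carrying `value` and `type`, rather than using plain JSON values. A dependency-free sketch of that shape (built with plain string formatting here just for illustration; in practice you'd use Jackson or a similar mapper):

```java
// Sketch of the variable payload shape the Camunda 7 REST API expects.
// Each variable is wrapped in {"value": ..., "type": ...} instead of
// being a plain JSON value, which is why generic (de)serializers need
// an adapter layer. Variable names and values here are made up.
public class CamundaVariablePayload {

    // Wraps a single raw JSON value into Camunda's envelope format.
    static String wrap(String rawJsonValue, String type) {
        return String.format("{\"value\": %s, \"type\": \"%s\"}", rawJsonValue, type);
    }

    public static void main(String[] args) {
        // A "start process instance" request body would look roughly like this:
        String body = "{\"variables\": {"
                + "\"amount\": " + wrap("1000", "Integer") + ", "
                + "\"approved\": " + wrap("true", "Boolean")
                + "}}";
        System.out.println(body);
    }
}
```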
Could someone throw some light on which direction to go and what is the best approach.
Sorry if its too noob question, as we are just getting started with it.
So the engine does indeed need to be part of the application, but you can consider it nothing more than adding a dependency to your Spring Boot project.
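Concretely, that dependency is the Camunda Spring Boot starter. A minimal sketch in Gradle Kotlin DSL (the version number is an assumption; check the docs for the current release):

```kotlin
// build.gradle.kts -- the starters pull in the engine, the REST API,
// and the webapps (Cockpit, Tasklist, Admin) respectively.
dependencies {
    implementation("org.camunda.bpm.springboot:camunda-bpm-spring-boot-starter-rest:7.18.0")
    implementation("org.camunda.bpm.springboot:camunda-bpm-spring-boot-starter-webapp:7.18.0")
}
```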
The process models themselves don't actually need to be kept in the application server; they can still be deployed to it independently via the REST API, and they are subsequently stored in the database, not in the application.
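For example, deploying a model to a running engine is a plain multipart POST against the deployment endpoint. Host, port, and file names below are placeholders:

```shell
# Deploy a BPMN model to a running engine via the REST API.
# The engine persists it in its database, not on the node's filesystem.
curl -X POST \
  -F "deployment-name=order-process" \
  -F "order-process.bpmn=@order-process.bpmn" \
  http://localhost:8080/engine-rest/deployment/create
```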
In the "embedded Camunda process engine in the existing Spring Boot application" approach: if we deploy 4 or 8 instances of this service, will each of those instances have its own process engine, and hence its own Cockpit and Tasklist?
In this case, is the REST API with external tasks against a standalone process engine the only option?
Not necessarily. The engine and its front ends are independent, so if you wanted to, you could create nodes intended only for external task workers to register with and fetch work from, while other nodes are dedicated to front-end users.
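A worker registering for external tasks might look like this, using Camunda's external task client library (assumes the `org.camunda.bpm:camunda-external-task-client` dependency; the base URL and topic name are placeholders):

```java
import org.camunda.bpm.client.ExternalTaskClient;

public class PaymentWorker {
    public static void main(String[] args) {
        // Point the client at whichever engine node serves the REST API.
        ExternalTaskClient client = ExternalTaskClient.create()
                .baseUrl("http://localhost:8080/engine-rest")
                .build();

        // Long-poll for work on a topic and report completion back
        // to the engine when done.
        client.subscribe("payment-topic")
                .handler((externalTask, externalTaskService) -> {
                    // ... do the actual work here ...
                    externalTaskService.complete(externalTask);
                })
                .open();
    }
}
```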
@Niall thanks for the reply.
But let's say one node goes down; then the process instances running on that node will also be lost, right?
Can multiple process engines share the same database and the same frontend? If yes, please let us know how to do that. In that case, can the other nodes pick up the process instance and complete the task?
Since we perform immutable deployments, the next time we deploy to a new VM, the old data from the process engine will be gone as well.
There could be an alternative for you: Home
It is a community extension that implements the Camunda Java API via REST, so you are implementing against the Java API, but the calls are sent to a remote engine over REST. It is not feature-complete yet, so you would have to verify that your use cases are supported.
Nope, no process data is stored on the node itself; it's stored in the database. You could have more than one node using the same database, so if one goes down, the others will still be able to do the work.
This isn't a problem and works out of the box: if you just start up two nodes pointing at the same database, there's no additional work needed.
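In Spring Boot terms, "pointing at the same database" just means both nodes carry the same datasource configuration. A sketch (driver, host, and credentials are placeholders):

```yaml
# application.yaml fragment -- give every node the same datasource and
# the embedded engines will share state through the database.
spring:
  datasource:
    url: jdbc:postgresql://shared-db:5432/camunda
    username: camunda
    password: change-me
```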
It depends on how you deploy things. If you're deploying a process model, it'll be stored in the database, so it doesn't matter if the node where the engine runs goes down, as long as the database persists the data.