My use case:
Multiple events arrive via a Kafka Connect sink, e.g. e1, e2, e3, e4.
I want to group these events' data/variables into an array like [e1, e2, e3, e4] and pass it on to the next block in the workflow.
I need to group the data of multiple events because it will be sent to a third-party webhook that accepts payloads in bulk.
Is this doable in a Zeebe workflow? If yes, how can I model it? Thanks in advance.
@piyush2206 you can always use a worker to do that and set the output mapping of the service task to return an array. Does that work for you?
@salaboy thanks for the reply.
A Zeebe worker usually receives a single message, processes it, and does the output mapping.
But in my case, the worker has to read, say, 100 messages from Kafka, put them into one array, and do the output mapping. Can we design this?
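The batching logic itself is independent of Kafka and Zeebe. A minimal sketch of what such a worker could do internally, with an illustrative `batch_events` helper (the name and batch size are assumptions, not Zeebe API):

```python
def batch_events(events, batch_size):
    """Group a stream of event payloads into lists of at most batch_size items."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch  # emit a full batch to the next step
            batch = []
    if batch:  # flush any remaining events as a final, smaller batch
        yield batch

# Usage: four events grouped into batches of up to 3
events = [{"id": "e1"}, {"id": "e2"}, {"id": "e3"}, {"id": "e4"}]
print(list(batch_events(events, 3)))
# → [[{'id': 'e1'}, {'id': 'e2'}, {'id': 'e3'}], [{'id': 'e4'}]]
```

The worker would then set the resulting array as a process variable via its output mapping, so the next task in the workflow sees one bulk payload.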
First of all, welcome to the Zeebe community!
Yes, it is possible to collect the variables of incoming messages. There are different ways to do this:
- using a job worker to collect the variables (as @salaboy suggested)
- using variable mappings in the workflow, for example, an output mapping on the message catch event
Output mappings can be used to create or update variables. The value of a variable is calculated by a FEEL expression.
The FEEL function `append(list, item)` can be used to append an item to an existing list:
source: `=append(payloads, messagePayload)`
`payloads` should be created as an empty list beforehand.
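As a rough sketch, the output mapping on the message catch event could look like this in the BPMN XML (using the Zeebe extension elements; the variable names `payloads` and `messagePayload` follow the expression above and are illustrative):

```xml
<bpmn:intermediateCatchEvent id="message-catch-event">
  <bpmn:extensionElements>
    <zeebe:ioMapping>
      <!-- append the incoming message payload to the collected list -->
      <zeebe:output source="=append(payloads, messagePayload)" target="payloads" />
    </zeebe:ioMapping>
  </bpmn:extensionElements>
</bpmn:intermediateCatchEvent>
```

Each time a message is correlated, the mapping appends its payload to the `payloads` variable, building up the array as the loop iterates.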
Example workflow: BPMN with collecting message variables · GitHub
In any case, the variable should not get too big. If the variable size exceeds 1 MB, or the list contains a lot of items, then you should consider storing the data externally.