Nick - again, great questions
You are right: Zeebe is the runtime workflow engine (the command part in CQRS) and Operate, with an underlying Elasticsearch, is the query part. Zeebe can export all records it processes to Elasticsearch, from where they are loaded into Operate's own Elasticsearch index to serve all the queries you need (so it is eventually consistent).
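Just to make the export mechanism concrete: Operate relies on the stock Elasticsearch exporter, but any exporter plugged into the broker sees the same record stream. A minimal sketch of a custom exporter (the class name is made up, and the exact package names depend on your Zeebe version):

```java
import io.zeebe.exporter.api.Exporter;
import io.zeebe.exporter.api.context.Controller;
import io.zeebe.protocol.record.Record;

// Registered in the broker configuration; the broker then hands every
// record (workflow instance events, jobs, incidents, ...) to export().
public class EventTrackingExporter implements Exporter {

  private Controller controller;

  @Override
  public void open(final Controller controller) {
    this.controller = controller;
  }

  @Override
  public void export(final Record<?> record) {
    // Ship the record wherever you need it (Elasticsearch, Kafka, stdout, ...)
    System.out.println(record.toJson());

    // Acknowledge the position so the broker can truncate its log
    controller.updateLastExportedRecordPosition(record.getPosition());
  }
}
```

For Operate you don't have to write any of this; enabling the built-in Elasticsearch exporter in the broker configuration is enough.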
For pure event tracking I see two possibilities now:
- Run a proper Zeebe broker which just listens to events. As a side effect you will be able to see everything in Operate as well. I showed an example of this in the Kafka talk; there is a rough sketch of the pattern right after this list.
- We are currently teaching https://camunda.com/products/optimize/ to process generic events, do a simple process discovery, and provide all the visibility and analysis Optimize is already capable of (currently it can only read data from the Camunda engine). We have a working prototype internally and are looking for users who are interested in this. Drop me a private mail if this sounds interesting and I can put you in touch with the product manager of Optimize (he is really eager to learn about the exact requirements and scenarios).
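For the first option, the pattern from the Kafka talk boils down to a small bridge that consumes events from Kafka and correlates them as messages into Zeebe. Roughly like this with the Zeebe Java client and a plain Kafka consumer (the topic name, message name and using the Kafka key as correlation key are just placeholders, and the client builder looks slightly different depending on the Zeebe version):

```java
import io.camunda.zeebe.client.ZeebeClient;   // io.zeebe.client in older versions
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class KafkaToZeebeBridge {

  public static void main(String[] args) {
    ZeebeClient zeebe = ZeebeClient.newClientBuilder()
        .gatewayAddress("localhost:26500")   // adjust to your broker/gateway
        .usePlaintext()
        .build();

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("group.id", "zeebe-event-bridge");
    props.put("key.deserializer", StringDeserializer.class.getName());
    props.put("value.deserializer", StringDeserializer.class.getName());

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("payment-events"));      // hypothetical topic
      while (true) {
        for (ConsumerRecord<String, String> event : consumer.poll(Duration.ofSeconds(1))) {
          zeebe.newPublishMessageCommand()
              .messageName("payment-received")            // matches a message catch event in the model
              .correlationKey(event.key())                // e.g. the order id
              .variables(event.value())                   // event payload as JSON
              .send()
              .join();
        }
      }
    }
  }
}
```

The workflow model then only needs message catch events for the events you care about, and Operate shows you where every instance currently is.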
Currently it is not easily possible to “just” dump the Kafka events into Operate directly. Before you try that, I think it is easier to simply dump them into Elasticsearch (or the like) and build your own visualization using https://bpmn.io/.
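If you go that route, the “dump into Elasticsearch” part is only a few lines with the Elasticsearch high-level REST client (the index name and event payload are made up, and the IndexRequest constructor differs a bit between Elasticsearch 6.x and 7.x):

```java
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.xcontent.XContentType;

public class EventDump {

  public static void main(String[] args) throws Exception {
    try (RestHighLevelClient es = new RestHighLevelClient(
        RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

      // In practice this JSON would come straight off your Kafka topic
      String eventJson =
          "{\"orderId\":\"42\",\"eventType\":\"PaymentReceived\",\"timestamp\":\"2019-06-01T12:00:00Z\"}";

      // One document per event; query and aggregate it later for your visualization
      es.index(
          new IndexRequest("tracked-events").source(eventJson, XContentType.JSON),
          RequestOptions.DEFAULT);
    }
  }
}
```

From that index you can query the events per process instance and render them on top of your BPMN diagram with bpmn.io.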
WDYT?
Bernd