Hi,
I struggled with storing large data in a process variable via REST, so I tested it in Java directly:
// serialize the large string as a Java object so it is persisted as a BLOB
ObjectValue stringValueAsJavaObject = Variables.objectValue(largeStringValue)
    .serializationDataFormat(Variables.SerializationDataFormats.JAVA)
    .create();
processEngine.getRuntimeService().setVariable(processId, varName, stringValueAsJavaObject);
This code stores the string as a BLOB in the database (which is what I want to achieve).
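For reference, a variable stored this way can be read back as a deserialized ObjectValue, roughly like this (a sketch, assuming the same processId and varName as above):

// fetch the variable as a typed value; the engine deserializes it by default
ObjectValue storedValue = processEngine.getRuntimeService()
        .getVariableTyped(processId, varName);
String roundTrippedString = (String) storedValue.getValue();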
Now the “but”: if I try to complete a task within this process instance, I get a 500 internal error with the following stack trace:
Cannot complete task c2e259d1-3c13-11e9-8e82-0242ac150005: ENGINE-03083 Exception while executing Batch Database Operations with message '
### Error flushing statements. Cause: org.apache.ibatis.executor.BatchExecutorException: org.camunda.bpm.engine.impl.persistence.entity.HistoricTaskInstanceEntity.insertHistoricTaskInstanceEvent (batch index #5) failed. 4 prior sub executor(s) completed successfully, but will be rolled back. Cause: java.sql.BatchUpdateException: Batch entry 0 insert into ACT_HI_TASKINST (
ID_,
PROC_DEF_KEY_,
PROC_DEF_ID_,
ROOT_PROC_INST_ID_,
PROC_INST_ID_,
EXECUTION_ID_,
CASE_DEF_KEY_,
CASE_DEF_ID_,
CASE_INST_ID_,
CASE_EXECUTION_ID_,
ACT_INST_ID_,
NAME_,
PARENT_TASK_ID_,
DESCRIPTION_,
OWNER_,
ASSIGNEE_,
START_TIME_,
END_TIME_,
DURATION_,
DELETE_REASON_,
TASK_DEF_KEY_,
PRIORITY_,
DUE_DATE_,
FOLLOW_UP_DATE_,
TENANT_ID_,
REMOVAL_TIME_
) values (
'cf998252-3c13-11e9-8e82-0242ac150005',
'Z_Text',
'Z_Text:1:e4ec82cb-3c0c-11e9-8e82-0242ac150005',
'c2d9811a-3c13-11e9-8e82-0242ac150005',
'c2d9811a-3c13-11e9-8e82-0242ac150005',
'cf8b516c-3c13-11e9-8e82-0242ac150005',
NULL,
NULL,
NULL,
NULL,
'Task_1m38hn6:cf8b516d-3c13-11e9-8e82-0242ac150005',
'...LONG...STRING',
NULL,
NULL,
NULL,
'wf1@test.com',
'2019-03-01 11:18:59.656000 +00:00:00',
NULL,
NULL,
NULL,
'Task_1m38hn6',
50,
NULL,
NULL,
NULL,
NULL
) was aborted. Call getNextException to see the cause.
... org.postgresql.util.PSQLException: ERROR: value too long for type character varying(255)
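For reference, the equivalent completion via the Java API would look roughly like this (a sketch, assuming the processId from above and a single active user task; completing the task is what triggers the history insert shown above):

// look up the active user task of the process instance and complete it
Task task = processEngine.getTaskService().createTaskQuery()
        .processInstanceId(processId)
        .singleResult();
processEngine.getTaskService().complete(task.getId());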
It looks like my large process variable is not handled correctly when the task is completed. Or am I doing something wrong here?
Best Regards
Alex