Error while storing a variable

Hi,

I noticed the following error when a variable with a JSON value is created in Camunda. I'm using MySQL as the Camunda engine database, and it looks like the variable content is too big for Camunda to store in the engine database. How can one overcome this?

Cannot instantiate process definition process_Request:1:0462562a-b83d-11e9-b716-005056b34b7e: ENGINE-03083 Exception while executing Batch Database Operations with message ' ### Error flushing statements. Cause: org.apache.ibatis.executor.BatchExecutorException: org.camunda.bpm.engine.impl.persistence.entity.HistoricVariableInstanceEntity.insertHistoricVariableInstance (batch index #1) failed. Cause: java.sql.BatchUpdateException: Data truncation: Data too long for column 'TEXT_' at row 1 ### Cause: org.apache.ibatis.executor.BatchExecutorException: org.camunda.bpm.engine.impl.persistence.entity.HistoricVariableInstanceEntity.insertHistoricVariableInstance (batch index #1) failed. Cause: java.sql.BatchUpdateException: Data truncation: Data too long for column 'TEXT_' at row 1 java.sql.BatchUpdateException: Data truncation: Data too long for column 'TEXT_' at row 1 com.mysql.cj.jdbc.exceptions.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1 '. Flush summary: [ INSERT HistoricVariableInstanceEntity[d6ad1c73-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6cb2bc5-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6cb2bc6-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6cba0f7-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6ccb268-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6ccd979-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6cf4a7a-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6d1e28b-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6d3de5c-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableInstanceEntity[d6d69d7d-b859-11e9-81dc-005056b34b7e] INSERT HistoricJobLogEventEntity[d6d6ebaa-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d7f-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d80-b859-11e9-81dc-005056b34b7e] 
INSERT HistoricVariableUpdateEventEntity[d6d69d81-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d82-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d83-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d84-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d69d85-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d6c496-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d6c497-b859-11e9-81dc-005056b34b7e] INSERT HistoricVariableUpdateEventEntity[d6d6c498-b859-11e9-81dc-005056b34b7e] INSERT HistoricProcessInstanceEventEntity[d6ad1c72-b859-11e9-81dc-005056b34b7e] INSERT HistoricActivityInstanceEventEntity[start_sam:d6ad1c74-b859-11e9-81dc-005056b34b7e] INSERT ExecutionEntity[d6ad1c72-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6ad1c73-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6cb2bc5-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6cb2bc6-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6cba0f7-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6ccb268-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6ccd979-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6cf4a7a-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6d1e28b-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6d3de5c-b859-11e9-81dc-005056b34b7e] INSERT VariableInstanceEntity[d6d69d7d-b859-11e9-81dc-005056b34b7e] INSERT MessageEntity[d6d6eba9-b859-11e9-81dc-005056b34b7e] ]

The Camunda schema defines the TEXT_ column as varchar(4000), and this prevents larger values from being stored in the variable history. IMO this is a bug. Changing the table column type to a blob type does not solve the issue.
I overcame this by splitting the JSON content into multiple parts and storing them in separate variables. Before using the JSON object in my JavaScript, I concatenated the parts and parsed the result back into a JSON object.
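The split/concatenate workaround can be sketched roughly as follows (the function names, chunk size, and variable naming scheme are my own illustration, not from the original post; the chunk size just needs to stay below the 4000-character TEXT_ limit):

```javascript
// Chunk size chosen below the varchar(4000) limit of the TEXT_ column.
const CHUNK_SIZE = 3500;

// Split a large JSON string into fixed-size chunks that each fit in TEXT_.
function splitJson(jsonString, chunkSize) {
  const parts = [];
  for (let i = 0; i < jsonString.length; i += chunkSize) {
    parts.push(jsonString.substring(i, i + chunkSize));
  }
  return parts;
}

// Reassemble the chunks and parse back into a JSON object.
function joinJson(parts) {
  return JSON.parse(parts.join(''));
}

// Example: store each part as its own process variable, e.g.
// "payload_0", "payload_1", ... (in a Camunda script task this would
// be done via execution.setVariable(...)), then later collect the
// parts in order and call joinJson(parts) to recover the object.
```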

Hope this helps someone.