Insert into ACT_HI_VARINST suddenly started failing due to Data truncation error

Hello everyone!

I use the REST API to create process instances, and suddenly they stopped being created. I am not trying to save large objects, so I have no idea where the truncation error comes from.

Any help is greatly appreciated. The logs are as follows:

26-Nov-2019 17:48:29.115 SEVERE [pool-2-thread-7] org.camunda.commons.logging.BaseLogger.logError ENGINE-16004 Exception while closing command context: ENGINE-03004 Exception while executing Database Operation 'INSERT HistoricVariableInstanceEntity[1991b079-108e-11ea-9391-0242ac180007]' with message '
### Error updating database.  Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1
### The error may involve org.camunda.bpm.engine.impl.persistence.entity.HistoricVariableInstanceEntity.insertHistoricVariableInstance-Inline
### The error occurred while setting parameters
### SQL: insert into ACT_HI_VARINST (       ID_,       PROC_DEF_KEY_,       PROC_DEF_ID_,       PROC_INST_ID_,       EXECUTION_ID_,       ACT_INST_ID_,       TENANT_ID_,       CASE_DEF_KEY_,       CASE_DEF_ID_,       CASE_INST_ID_,       CASE_EXECUTION_ID_,       TASK_ID_,       NAME_,       REV_,       VAR_TYPE_,       BYTEARRAY_ID_,       DOUBLE_,       LONG_,       TEXT_,       TEXT2_     ) values (       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?     )
### Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1'. Flush summary: 
 [
  
	INSERTS...
]
 org.camunda.bpm.engine.ProcessEngineException: ENGINE-03004 Exception while executing Database Operation 'INSERT HistoricVariableInstanceEntity[1991b079-108e-11ea-9391-0242ac180007]' with message '
### Error updating database.  Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1
### The error may involve org.camunda.bpm.engine.impl.persistence.entity.HistoricVariableInstanceEntity.insertHistoricVariableInstance-Inline
### The error occurred while setting parameters
### SQL: insert into ACT_HI_VARINST (       ID_,       PROC_DEF_KEY_,       PROC_DEF_ID_,       PROC_INST_ID_,       EXECUTION_ID_,       ACT_INST_ID_,       TENANT_ID_,       CASE_DEF_KEY_,       CASE_DEF_ID_,       CASE_INST_ID_,       CASE_EXECUTION_ID_,       TASK_ID_,       NAME_,       REV_,       VAR_TYPE_,       BYTEARRAY_ID_,       DOUBLE_,       LONG_,       TEXT_,       TEXT2_     ) values (       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?     )
### Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1'. Flush summary: 
 [
  INSERTS.... 
]
	at org.camunda.bpm.engine.impl.db.EnginePersistenceLogger.flushDbOperationException(EnginePersistenceLogger.java:115)
	at org.camunda.bpm.engine.impl.db.entitymanager.DbEntityManager.flushDbOperationManager(DbEntityManager.java:305)
	at org.camunda.bpm.engine.impl.db.entitymanager.DbEntityManager.flush(DbEntityManager.java:281)
	at org.camunda.bpm.engine.impl.interceptor.CommandContext.flushSessions(CommandContext.java:203)
	at org.camunda.bpm.engine.impl.interceptor.CommandContext.close(CommandContext.java:132)
	at org.camunda.bpm.engine.impl.interceptor.CommandContextInterceptor.execute(CommandContextInterceptor.java:113)
	at org.camunda.bpm.engine.impl.interceptor.ProcessApplicationContextInterceptor.execute(ProcessApplicationContextInterceptor.java:66)
	at org.camunda.bpm.engine.impl.interceptor.LogInterceptor.execute(LogInterceptor.java:30)
	at org.camunda.bpm.engine.impl.jobexecutor.ExecuteJobHelper.executeJob(ExecuteJobHelper.java:36)
	at org.camunda.bpm.engine.impl.jobexecutor.ExecuteJobHelper.executeJob(ExecuteJobHelper.java:29)
	at org.camunda.bpm.engine.impl.jobexecutor.ExecuteJobsRunnable.executeJob(ExecuteJobsRunnable.java:88)
	at org.camunda.bpm.engine.impl.jobexecutor.ExecuteJobsRunnable.run(ExecuteJobsRunnable.java:57)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Caused by: org.apache.ibatis.exceptions.PersistenceException: 
### Error updating database.  Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1
### The error may involve org.camunda.bpm.engine.impl.persistence.entity.HistoricVariableInstanceEntity.insertHistoricVariableInstance-Inline
### The error occurred while setting parameters
### SQL: insert into ACT_HI_VARINST (       ID_,       PROC_DEF_KEY_,       PROC_DEF_ID_,       PROC_INST_ID_,       EXECUTION_ID_,       ACT_INST_ID_,       TENANT_ID_,       CASE_DEF_KEY_,       CASE_DEF_ID_,       CASE_INST_ID_,       CASE_EXECUTION_ID_,       TASK_ID_,       NAME_,       REV_,       VAR_TYPE_,       BYTEARRAY_ID_,       DOUBLE_,       LONG_,       TEXT_,       TEXT2_     ) values (       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?,       ?     )
### Cause: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1
	at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:26)
	at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:154)
	at org.apache.ibatis.session.defaults.DefaultSqlSession.insert(DefaultSqlSession.java:141)
	at org.camunda.bpm.engine.impl.db.sql.DbSqlSession.executeInsertEntity(DbSqlSession.java:146)
	at org.camunda.bpm.engine.impl.db.sql.DbSqlSession.insertEntity(DbSqlSession.java:138)
	at org.camunda.bpm.engine.impl.db.AbstractPersistenceSession.executeDbOperation(AbstractPersistenceSession.java:41)
	at org.camunda.bpm.engine.impl.db.entitymanager.DbEntityManager.flushDbOperationManager(DbEntityManager.java:303)
	... 13 more
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'TEXT_' at row 1
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4072)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4006)
	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2468)
	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2629)
	at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2719)
	at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2155)
	at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1379)
	at org.apache.ibatis.executor.statement.PreparedStatementHandler.update(PreparedStatementHandler.java:44)
	at org.apache.ibatis.executor.statement.RoutingStatementHandler.update(RoutingStatementHandler.java:69)
	at org.apache.ibatis.executor.SimpleExecutor.doUpdate(SimpleExecutor.java:48)
	at org.apache.ibatis.executor.BaseExecutor.update(BaseExecutor.java:105)
	at org.apache.ibatis.executor.CachingExecutor.update(CachingExecutor.java:71)
	at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:152)
	... 18 more

Hi,
You may be running into the String length restriction, as per the docs here:

String length restriction - string values are stored in the database in a column of type (n)varchar, with a length restriction of 4000 characters (2000 for Oracle)…

- Identify a process variable of type String which may be exceeding this size.
- Consider not storing it in history (e.g. delete the variable before it is persisted, or change the history level).
- Consider storing it as a serialized object or byte array…
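A quick way to spot an oversized variable before it ever reaches history is to check its length up front. This is only an illustrative sketch (the class and method names are made up, not engine API); values over the limit can instead be handed to the engine as a byte array, which Camunda persists in ACT_GE_BYTEARRAY rather than the length-limited TEXT_ column:

```java
import java.nio.charset.StandardCharsets;

public class VariableSizer {
    // MySQL TEXT_ column limit per the docs; Oracle would be 2000.
    static final int TEXT_COLUMN_LIMIT = 4000;

    // Returns true if the value can safely be stored as a plain String variable.
    static boolean fitsAsString(String value) {
        return value != null && value.length() <= TEXT_COLUMN_LIMIT;
    }

    // Values over the limit are converted to a byte array instead of a String.
    static Object toVariableValue(String value) {
        return fitsAsString(value) ? value : value.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String small = "ok";
        String big = "x".repeat(5000);
        System.out.println(toVariableValue(small) instanceof String); // true
        System.out.println(toVariableValue(big) instanceof byte[]);   // true
    }
}
```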

regards

Rob


Hi Rob! Thank you very much, that was it.

A JSON object from a web service ended up growing much larger than expected, which took me by surprise, especially since the troublesome table was supposed to contain only historical data. I admit my understanding of Camunda's data model is not very deep, and I still haven't dived into it much.

I ended up serializing the data properly as JSON using the Spin plugin, and took the opportunity to trim it down to only the useful bits before persisting it.
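Roughly, the trimming step boils down to keeping only the fields you actually need before setting the variable. Plain-Java sketch only (the real serialization went through Spin, and the field names here are made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PayloadTrimmer {
    // Keep only the listed keys from the parsed web-service payload,
    // so the stored variable stays well under the 4000-char column limit.
    static Map<String, Object> trim(Map<String, Object> payload, String... keysToKeep) {
        Map<String, Object> trimmed = new LinkedHashMap<>();
        for (String key : keysToKeep) {
            if (payload.containsKey(key)) {
                trimmed.put(key, payload.get(key));
            }
        }
        return trimmed;
    }

    public static void main(String[] args) {
        Map<String, Object> payload = new LinkedHashMap<>();
        payload.put("orderId", "A-42");        // illustrative field names
        payload.put("status", "CONFIRMED");
        payload.put("rawResponse", "huge blob"); // the part that blew the limit

        Map<String, Object> small = trim(payload, "orderId", "status");
        System.out.println(small.keySet()); // [orderId, status]
    }
}
```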

Thanks again!

Cheers,

David.