We have many transformations (most are dimension lookups, some are fact lookups) that run in parallel. Each transformation can run up to 4 times in parallel.
We achieve this with a "Job Executor" step that starts 4 parallel instances; all the transformations run inside that Job Executor.
With this setup, transformation X (which updates table X) may run 1, 2, 3, or 4 times in parallel.
Nothing special so far; that part works fine.
Recently, however, we have started getting a deadlock error on the transformation log table (trans_log):
"There was an error calculating the change data capture date range, it probably involved log table PDI_TRANS_LOG.
Couldn't get row from result set
Transaction (Process ID 53) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."
The transformation log table, user, and password are defined in the kettle.properties file. The logging database is Microsoft SQL Server 2017 on a Linux machine. The database used for the lookups and the database used for logging are not the same.
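For reference, the relevant part of our kettle.properties looks roughly like this (the connection name and schema are illustrative placeholders, not our real values; the table name matches the one in the error above):

```properties
# Name of the PDI database connection used for transformation logging
KETTLE_TRANS_LOG_DB=LoggingDB
# Schema and table for the transformation log
KETTLE_TRANS_LOG_SCHEMA=dbo
KETTLE_TRANS_LOG_TABLE=PDI_TRANS_LOG
```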
We have already enabled the Advanced option in the Database Connection to use an MSSQL sequence for the logging batch ID (SEQUENCE_FOR_BATCH_ID). That works fine.
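The sequence on the logging server was created with T-SQL roughly like this (schema and start value are illustrative; the sequence name is the one referenced in the connection settings):

```sql
-- Sequence used by PDI to assign log batch IDs, so parallel runs
-- don't all compute MAX(ID_BATCH)+1 against the log table
CREATE SEQUENCE dbo.SEQUENCE_FOR_BATCH_ID
    START WITH 1
    INCREMENT BY 1;
```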
We also tried disabling the "startdate" logging field in the transformation log settings, but we got the same error again.
Can anybody help with a hint?
#Pentaho #PentahoDataIntegrationPDI #Kettle