Hello All.
I'm facing a performance problem when I run a transformation in Pentaho (PDI). The target database is SQL Server on Azure. The job takes 5 minutes to load 9k rows. I've already set a few options, such as "Use batch update for inserts" on the Table output step and the parameter useBulkCopyForBatchInsert=true on the JDBC connection. When I increased the "Commit size" on the Table output step it ran faster, but I got the error below.
2018/10/08 09:41:45 - Table output.0 - ERROR : Unexpected batch update error committing the database connection.
2018/10/08 09:41:45 - Table output.0 - ERROR: org.pentaho.di.core.exception.KettleDatabaseBatchException:
2018/10/08 09:41:45 - Table output.0 - Error updating batch
2018/10/08 09:41:45 - Table output.0 - I/O Error: Connection reset by peer: socket write error
2018/10/08 09:41:45 - Table output.0 - at org.pentaho.di.core.database.Database.emptyAndCommit(Database.java:1278)
2018/10/08 09:41:45 - Table output.0 - ERROR: Unexpected error rolling back the database connection.
2018/10/08 09:41:45 - Table output.0 - ERROR : org.pentaho.di.core.exception.KettleDatabaseException:
2018/10/08 09:41:45 - Table output.0 - Error performing rollback on connection
2018/10/08 09:41:45 - Table output.0 - Invalid state, the Connection object is closed.
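For reference, the JDBC URL with bulk copy enabled looks roughly like this (the server and database names here are placeholders, not my real ones):

```
jdbc:sqlserver://<yourserver>.database.windows.net:1433;databaseName=<yourdb>;useBulkCopyForBatchInsert=true;
```

As I understand it from the driver documentation, useBulkCopyForBatchInsert only takes effect with the Microsoft mssql-jdbc driver (6.5.4 or later) and only for prepared-statement batch inserts, so it may silently do nothing if PDI ships an older driver. I'm not sure whether that interacts with the commit size error above.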
I'd really appreciate any help.
Thanks in advance.
Raphael