
Loading files in a loop leads to an out-of-memory issue

Question asked by Aleksandr Kirillov on Feb 13, 2018
Latest reply on Feb 15, 2018 by Diego Mainou


Hi,

I have a simple job that loads CSV files from a folder into an Oracle database. It consists of two transformations: the first gets all filenames and copies them to the result, and the second runs in a loop, reading each file and writing its rows to a DB table. The CSV files are quite simple (1-2 KB), and some of them are even empty.

After around 300 files the job starts to run really slowly, and soon it stops completely with an "out of memory" error. I've given Spoon 8 GB of memory. Ideally PDI should free memory after each file, but in my case it seems to keep all the data in memory. I've also tried running the job on an old version of PDI (4.4); it can process more files, but in the end the job still crashes. Currently I have to limit the number of incoming files and run the job several times.
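For reference, this is roughly how I raised Spoon's heap, by editing the JVM options in spoon.sh (the exact variable name differs between PDI versions, and the values below are just mine):

    # spoon.sh -- give the JVM an 8 GB maximum heap.
    # Recent PDI versions read PENTAHO_DI_JAVA_OPTIONS; older releases
    # such as 4.4 use a different variable (e.g. JAVAMAXMEM).
    export PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx8192m"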

I hope someone can help me with this issue.
