
Pentaho fills all available memory when running a simple job

Question asked by Giuseppe La Rosa on Oct 21, 2018
Latest reply on Dec 5, 2018 by Diego Mainou

Hi everyone,

 

I would like to ask the Pentaho community for help in solving an issue I have.

I'm using Pentaho 7.1 - Community Edition.

 

The problem is that Pentaho very often fills all the available memory and gets stuck when running jobs that contain other jobs (sub-jobs).

 

To better clarify what I mean, I reproduced the issue with a simple example (attached to this post).

It is just a simple job (master_job.kjb) that calls two other jobs, each containing two transformations.

The first job reads an input file (1 million rows, 13 fields, about 90 MB), adds a new field, concatenates two of the fields into a new one, and writes everything to a file. The second job reads that output, performs some string substitution, and writes a second file.
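
Schematically, the structure looks like this (the sub-job and transformation names below are only indicative; the real ones are in the attachment):

    master_job.kjb
        job_A.kjb  -> 2 transformations (read input file, add new field, concatenate two fields, write output)
        job_B.kjb  -> 2 transformations (read previous output, string substitution, write second file)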

 

The problem is that when I run this master job in Pentaho, it gets stuck after executing the first sub-job: it fills all my memory and never advances to the second one.

I really cannot understand what is happening here. To me this looks like a pretty simple task, but for some reason Pentaho is not able to perform it.

 

FYI: I already did some research and increased the Java Virtual Machine memory for PDI by setting the environment variable "PENTAHO_DI_JAVA_OPTIONS" to "-Xms2048m -Xmx8g -XX:MaxPermSize=1024m", but it did not help.
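
For completeness, this is how I set it before launching Spoon (assuming the standard Spoon.bat / spoon.sh launch scripts, which read this variable; adjust the path to your installation):

    Windows (before running Spoon.bat):
        set PENTAHO_DI_JAVA_OPTIONS=-Xms2048m -Xmx8g -XX:MaxPermSize=1024m
        Spoon.bat

    Linux (before running spoon.sh):
        export PENTAHO_DI_JAVA_OPTIONS="-Xms2048m -Xmx8g -XX:MaxPermSize=1024m"
        ./spoon.sh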

 

I really hope you can give me some help/advice.

 

Best regards,

Giuseppe
