Original Message:
Sent: 12-12-2022 17:39
From: Stephen Donovan
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
Not sure if it helps, but when I ran it in version 9.3 (I altered your Write to log a bit), I don't see the variable being set at all. I think this is because customerId is a parameter with an empty default value; you may be crossing variable and parameter contexts. However, I am not seeing it set to the wrong value, as you stated, but not set at all.
Line 68: 2022/12/12 17:36:58 - wtl-customers.0 - customerId =
Line 83: 2022/12/12 17:36:58 - wtl-customers.0 - customerId =
Line 98: 2022/12/12 17:36:58 - wtl-customers.0 - customerId =
Line 118: 2022/12/12 17:36:58 - wtl-companys.0 - customerId =
Line 133: 2022/12/12 17:36:58 - wtl-companys.0 - customerId =
Line 148: 2022/12/12 17:36:58 - wtl-customers.0 - customerId =
Line 163: 2022/12/12 17:36:58 - wtl-sales.0 - customerId =
Line 188: 2022/12/12 17:36:58 - wtl-customers.0 - customerId =
Line 203: 2022/12/12 17:36:58 - wtl-companys.0 - customerId =
Line 218: 2022/12/12 17:36:58 - wtl-companys.0 - customerId =
Line 233: 2022/12/12 17:36:58 - wtl-sales.0 - customerId =
Line 258: 2022/12/12 17:36:58 - wtl-sales.0 - customerId =
Line 283: 2022/12/12 17:36:58 - wtl-companys.0 - customerId =
Line 298: 2022/12/12 17:36:58 - wtl-sales.0 - customerId =
Line 323: 2022/12/12 17:36:58 - wtl-sales.0 - customerId =
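That pattern is what I would expect if each sub-transformation declares customerId as a named parameter with an empty default: when nothing is actually passed in, activating the parameter resolves ${customerId} to that empty default rather than to a variable set higher up. As a rough sketch (from memory, so the exact tags Spoon writes may differ by version), the declaration inside the .ktr looks something like:

<!-- Inside the transformation definition in the .ktr -->
<parameters>
  <parameter>
    <!-- Empty default: if the parent never passes a value, ${customerId}
         resolves to this empty string, shadowing any parent variable. -->
    <name>customerId</name>
    <default_value/>
    <description>Customer being processed</description>
  </parameter>
</parameters>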
In your etl_customer job, where you set the variable into the current job (first image), why do you do this instead of simply passing the parameter down to the Process Tables job, where it is declared as a parameter (second image)? I understand that this is a simplified version, and maybe you removed context that would require setting this locally as a variable instead of passing the parameter.
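For reference, passing it down means mapping the parent's customerId onto the child job's parameter in the job entry itself. This is roughly what that ends up as in the .kjb (a hand-written sketch; the file name is a placeholder, position/appearance attributes are omitted, and the exact serialization may vary between versions):

<entry>
  <name>Process Tables</name>
  <type>JOB</type>
  <!-- Placeholder path for the child job. -->
  <filename>${Internal.Entry.Current.Directory}/process_tables.kjb</filename>
  <parameters>
    <pass_all_parameters>N</pass_all_parameters>
    <parameter>
      <!-- Parameter declared in the child job... -->
      <name>customerId</name>
      <stream_name/>
      <!-- ...filled from the parent's own customerId parameter. -->
      <value>${customerId}</value>
    </parameter>
  </parameters>
</entry>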
When I do this, I get the 5 customer IDs uniquely in each sub-transformation (customers, companys and sales).
Line 68: 2022/12/12 17:20:52 - wtl-customers.0 - customerId = 4
Line 83: 2022/12/12 17:20:52 - wtl-customers.0 - customerId = 3
Line 98: 2022/12/12 17:20:52 - wtl-customers.0 - customerId = 5
Line 118: 2022/12/12 17:20:52 - wtl-companys.0 - customerId = 3
Line 133: 2022/12/12 17:20:52 - wtl-companys.0 - customerId = 4
Line 148: 2022/12/12 17:20:52 - wtl-customers.0 - customerId = 1
Line 163: 2022/12/12 17:20:52 - wtl-sales.0 - customerId = 3
Line 188: 2022/12/12 17:20:52 - wtl-customers.0 - customerId = 2
Line 203: 2022/12/12 17:20:52 - wtl-companys.0 - customerId = 5
Line 218: 2022/12/12 17:20:52 - wtl-companys.0 - customerId = 1
Line 233: 2022/12/12 17:20:52 - wtl-sales.0 - customerId = 4
Line 258: 2022/12/12 17:20:52 - wtl-sales.0 - customerId = 5
Line 283: 2022/12/12 17:20:52 - wtl-companys.0 - customerId = 2
Line 296: 2022/12/12 17:20:52 - wtl-sales.0 - customerId = 1
Line 323: 2022/12/12 17:20:52 - wtl-sales.0 - customerId = 2
------------------------------
Stephen Donovan
Digital Solutions Architect
Hitachi Vantara
Original Message:
Sent: 12-12-2022 07:36
From: Stevens Silva
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
I created a project as a repository of example files showing how I create variables, pass them as parameters, and define their scope. The project is much simpler, but it gives a good example of the problem: the variables end up with the value of the last execution when started from the main job "start".
------------------------------
Stevens Silva
Chief Technology Officer
Solcast
Original Message:
Sent: 12-07-2022 23:24
From: Stephen Donovan
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
If you could include a simplified version that shows the variables in scope (a simple Write to log) in the sub-transformations, along with your settings, the community can try to test this in various versions/environments and see if we can reproduce or fix what you are seeing.
------------------------------
Stephen Donovan
Digital Solutions Architect
Hitachi Vantara
Original Message:
Sent: 12-07-2022 11:31
From: Stevens Silva
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
Thanks Stephen. I had already tried the scope setting, but it seems it was not being respected. I did a new test with a sub-job that receives the values as parameters, and for each client I initialize all the variables that this sub-job will receive as parameters with the scope of the current job. There are still problems: the variables end up being shared between running jobs, as if the reference were lost and they were not isolated.
Example Job Customer
------------------------------
Stevens Silva
Chief Technology Officer
Solcast
Original Message:
Sent: 12-07-2022 11:05
From: Stephen Donovan
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
It would work in Kitchen because each execution is a new JVM. It sounds like the variable scope you are setting is at the JVM level. You will want to change that to set the proper context: the job itself (which will pass down), or the parent job if you are setting this in a child transformation or job. Avoid the JVM level in almost every case.
If you think you have done this properly, include a test transformation/job for us to review.
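To be concrete, the scope I mean is the "Variable scope type" on the Set Variables step; in the saved .ktr it becomes the variable_type code. A minimal sketch (codes as I remember recent versions writing them, so treat the exact names as approximate):

<step>
  <name>set-customer-id</name>
  <type>SetVariable</type>
  <fields>
    <field>
      <field_name>customerId</field_name>
      <variable_name>customerId</variable_name>
      <!-- Possible codes: JVM (valid in the whole Java Virtual Machine - avoid),
           ROOT_JOB, GRAND_PARENT_JOB, PARENT_JOB. Pick the narrowest scope that
           still reaches the job/transformation that needs the value. -->
      <variable_type>PARENT_JOB</variable_type>
      <default_value/>
    </field>
  </fields>
</step>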
------------------------------
Stephen Donovan
Digital Solutions Architect
Hitachi Vantara
Original Message:
Sent: 12-06-2022 08:07
From: Stevens Silva
Subject: Run the same Job simultaneously with different parameters in Pentaho PDI
I developed a job that receives the customer ID as a parameter and loads the connection data from the database, so that I don't have to duplicate code and increase the maintenance effort.
However, when the job is executed simultaneously through a main job, both the parameters and the variables scoped to the current job keep the value of the last job started.
If you trigger the job through Kitchen on the command line, it works perfectly and the variables are isolated. However, it is extremely slow and consumes far more machine resources, because several instances of Pentaho are running, even with JVM memory usage limited to 2 GB on an 8 GB machine with 4 vCPUs. I tested with 3 jobs running simultaneously.
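To illustrate the setup: the transformations use a connection whose fields are variables that I fill in per customer after looking them up. A simplified sketch, with made-up variable and connection names rather than my real definition, looks like this:

<connection>
  <name>customer_db</name>
  <!-- These variables are set per customerId after the lookup; this is
       where the values get mixed up between simultaneous runs. -->
  <server>${CUSTOMER_DB_HOST}</server>
  <type>MYSQL</type>
  <access>Native</access>
  <database>${CUSTOMER_DB_NAME}</database>
  <port>${CUSTOMER_DB_PORT}</port>
  <username>${CUSTOMER_DB_USER}</username>
  <password>${CUSTOMER_DB_PASSWORD}</password>
</connection>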
I would like to know whether anyone has had a similar problem and how they resolved it.
------------------------------
Stevens Silva
Chief Technology Officer
Solcast
------------------------------