Pentaho


 How to run Kettle jobs on CE Pentaho Server?

  • Pentaho
  • Kettle
  • Pentaho Data Integration PDI
Kate Rakel posted 08-29-2019 08:35

I want to run Kettle jobs on Pentaho Server using this article: https://stackoverflow.com/questions/35705921/how-to-deploy-scheduled-kettle-jobs-on-pentaho-bi-server-v6-ce

I've created a transformation, step1.ktr, and added it to the job Toserver.kjb.

My job exports records from one Salesforce org to another.

In the job I've created a parameter (the path on the server):

tr1

and entered the corresponding path in the transformation:

tr2

After that I created an xaction file that executes the Kettle job on the BI server (daily.xaction). I copied this file from the article above and changed only the job name.

Next I uploaded all my files to the Pentaho Server etl folder and scheduled the "daily" file:

etl_folder

At the scheduled time, an error file was created in the Kettle folder with the text:

org.pentaho.platform.api.engine.ActionExecutionException: RuntimeContext.ERROR_0017 - Action failed to execute

 

What have I done wrong, and how do I run jobs on Pentaho Server?

 


David da Guia Carvalho

In Pentaho 7 and later you can do that without the xaction. Just deploy the kjb/ktr files and use the scheduler.
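If you'd rather script the scheduling than click through the UI, Pentaho Server also exposes a scheduler REST endpoint. The sketch below builds a request body for it; the exact field names (`inputFile`, `jobName`, `cronJobTrigger`), the repository path, and the credentials are assumptions to verify against your server's API docs, not a tested recipe.

```python
# Sketch: scheduling a deployed .kjb via Pentaho Server's scheduler REST
# API. Endpoint path, JSON field names, repository path, and credentials
# below are ASSUMPTIONS -- check them against your server before use.
import base64
import json
import urllib.request


def build_schedule_payload(job_path, cron="0 0 6 * * ?"):
    """Build a JSON body for POST /pentaho/api/scheduler/job.

    job_path is the repository path of the uploaded .kjb file;
    cron is a Quartz cron expression (here: daily at 06:00).
    """
    return {
        "inputFile": job_path,
        "jobName": job_path.rsplit("/", 1)[-1],
        "cronJobTrigger": {"cronString": cron},
    }


def schedule_job(base_url, user, password, payload):
    """POST the payload to the (assumed) scheduler endpoint."""
    req = urllib.request.Request(
        base_url + "/pentaho/api/scheduler/job",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic "
            + base64.b64encode(f"{user}:{password}".encode()).decode(),
        },
        method="POST",
    )
    return urllib.request.urlopen(req)


# Build (but don't send) a payload for the job from this thread:
payload = build_schedule_payload("/home/admin/etl/Toserver.kjb")
print(payload["jobName"])
```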

Kate Rakel

Thanks, I've run my job. Maybe you can also tell me where I can check the logs for this transformation, or the job status?

David da Guia Carvalho

If you are running it locally (without remote execution on a PDI server/Carte) and without any extra configuration, the log should be in catalina.out on your BA Server.

If you need more control over your logs, I suggest you set up database logging in the transformation/job.
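A quick way to fish a job's lines out of catalina.out is just to filter by the job name. A minimal sketch (the log path shown is typical for a Tomcat-based install but is an assumption; adjust it to your installation):

```python
# Sketch: pull a job's log lines out of catalina.out by name.
from pathlib import Path


def grep_job_log(log_path, job_name):
    """Return the lines of the server log that mention the given job."""
    lines = Path(log_path).read_text(errors="replace").splitlines()
    return [line for line in lines if job_name in line]


# Example usage (path is hypothetical):
# for line in grep_job_log(
#     "/opt/pentaho/server/tomcat/logs/catalina.out", "Toserver"
# ):
#     print(line)
```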

Kate Rakel

Thanks a lot

T N

Can we also pass parameters to jobs while scheduling them on Pentaho BI Server release 7?
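If the REST scheduler route is used, job parameters can reportedly be attached to the schedule request as a list of name/value pairs. The field name `jobParameters` and its shape below are assumptions to verify against your server's scheduler API documentation; this is a sketch, not a confirmed payload format.

```python
# Sketch: attaching named parameters to a scheduler request body.
# The "jobParameters" field name and its shape are ASSUMPTIONS --
# verify against your Pentaho Server's scheduler API docs.
def with_job_parameters(payload, params):
    """Return a copy of a scheduler payload with named parameters attached."""
    payload = dict(payload)  # shallow copy; don't mutate the caller's dict
    payload["jobParameters"] = [
        {"name": name, "value": value}
        for name, value in sorted(params.items())
    ]
    return payload


# Example: pass the path parameter from this thread to the scheduled job.
scheduled = with_job_parameters(
    {"inputFile": "/home/admin/etl/Toserver.kjb"},
    {"path_on_server": "/home/admin/etl/step1.ktr"},
)
print(scheduled["jobParameters"])
```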