 How to add custom fields to pentaho logging tables

Aditya Kanthe posted 02-13-2019 13:25

Hi,

I am trying to add some custom fields to the Pentaho logging tables, and I want the values of those columns to be populated at the moment the insert into the logging tables happens. Can someone please guide me on this?

Thanks


#PentahoDataIntegrationPDI
#Pentaho
#Kettle
Roguen Keller

Hello Aditya Kanthe

Can you provide some more details? That may prompt more and better responses.

  • What version of PDI/Spoon are you using?
  • Are you using some specific steps right now, and can you give us an idea of your configuration?
Aditya Kanthe

Hello Roguen Keller, I am using PDI 8.0. I have enabled database logging by setting the following variables in kettle.properties:

KETTLE_TRANS_LOG_DB

KETTLE_JOB_LOG_TABLE

I have added two additional fields to these tables, namely company_id and import_id. I am processing files placed in an FTP location using a for-each loop step, so company_id and import_id change per file, and I want these values to be inserted at the same time the log values are inserted, because afterwards it is hard for me to identify which log record belongs to which company_id and import_id combination.
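For reference, the kettle.properties entries look something like the following (the connection and table names below are just placeholders, not my real ones):

    # Shared database connection used for transformation/job logging
    KETTLE_TRANS_LOG_DB=logging_db
    KETTLE_TRANS_LOG_TABLE=trans_log
    KETTLE_JOB_LOG_DB=logging_db
    KETTLE_JOB_LOG_TABLE=job_log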

Sparkles Sparkles

I thought the default log table (or tables?) was generic for all transformations? Or is it one table per transformation? What would your extra columns be for other transformations, null?

Sounds like you need to edit the source code and compile your own version if you want behavior different from the standard logging. I found it easier to abandon the built-in logging system entirely and make my own instead. I still use the standard logging for errors (if and when they happen), but for all other logging I handle it manually through steps.
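Just to illustrate the idea, the manual insert boils down to something like the sketch below, written here in plain Java/JDBC rather than as a Table Output step; the JDBC URL, credentials, table and column names are only examples, not anything Pentaho provides:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    public class CustomLogWriter {

        // Example connection details only; in a real setup these would come
        // from your own configuration, and the table would be created up front.
        private static final String URL = "jdbc:postgresql://localhost:5432/logging_db";

        public static void logFile(String companyId, String importId,
                                   String transName, String status) throws Exception {
            String sql = "INSERT INTO custom_trans_log "
                       + "(company_id, import_id, trans_name, status, logged_at) "
                       + "VALUES (?, ?, ?, ?, ?)";
            try (Connection con = DriverManager.getConnection(URL, "pdi_user", "secret");
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, companyId);
                ps.setString(2, importId);
                ps.setString(3, transName);
                ps.setString(4, status);
                ps.setTimestamp(5, new Timestamp(System.currentTimeMillis()));
                ps.executeUpdate();
            }
        }
    }

Inside a transformation the same thing is just a Table Output (or Insert/Update) step writing to your own log table, with company_id and import_id passed in as fields or parameters.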

Aditya Kanthe

Hmm, I was thinking of extending the behavior of the logging classes like StepLogTable and JobLogTable, but those classes have private constructors, so no luck there. The TransMeta class also uses concrete member variables of the StepLogTable and JobLogTable classes, so no luck there either. I guess I have only the two options you mentioned: either do custom logging using steps, or change the source code and compile it myself. Although I think it would be good if the constructors of classes like StepLogTable and JobLogTable were made protected.
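For anyone who wants to check this against their own PDI version, a quick reflection sketch along these lines prints the constructors and their access modifiers (it assumes kettle-core is on the classpath, and that the class names are as they appear in PDI 8.0):

    import java.lang.reflect.Constructor;

    public class LogTableCtorCheck {
        public static void main(String[] args) throws Exception {
            // Fully qualified names as they appear in the PDI 8.0 codebase
            String[] classNames = {
                "org.pentaho.di.core.logging.StepLogTable",
                "org.pentaho.di.core.logging.JobLogTable"
            };
            for (String name : classNames) {
                Class<?> clazz = Class.forName(name);
                for (Constructor<?> ctor : clazz.getDeclaredConstructors()) {
                    // Constructor#toString() includes the access modifier (e.g. "private")
                    System.out.println(ctor);
                }
            }
        }
    }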

Ana Gonzalez

I agree with Sparkles Sparkles: those are generic tables, and not all jobs or steps are related to FTPing files, so it makes more sense to insert that kind of information into separate tables and build some logic in your transformations and jobs (no need to compile your own version of Pentaho) that inserts into those tables when you are doing the FTP.

Regards