 Multiple table inputs from source and single output

Muhammad Adeel Khan posted 07-28-2019 18:50

Currently I am working on building a transformation where I have to read all the tables in a particular database on MS SQL Server 2012. There can be as many as 20 tables in some cases, so I am using 20 different Table Input steps in my transformation. I then have to check the columns in each table and perform operations such as splitting fields on delimiters, string operations, and date conversions before appending the rows and finally loading all of these tables into one PostgreSQL table. The columns in each table are different.

I just wanted to check whether there is a better way: instead of having 20 different Table Input steps, could I use one input that reads all the tables in the database and then analyzes the columns in each of those tables? Please find attached a picture of the transformation below. I just want to reduce the number of steps and make it more dynamic. Any suggestions, please?
#Pentaho
Ana Gonzalez

No image attached, but anyway, the answer would be Metadata Injection, at least for reading the tables: querying the catalog tables in the database to return all the tables you need and their columns would make that part automatic, for example with a catalog query like the sketch below.
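As a rough illustration, on SQL Server the standard INFORMATION_SCHEMA views can feed a Metadata Injection template with one row per column; the 'dbo' schema filter here is just an assumption to adjust for your database:

```sql
-- Sketch of a catalog query for SQL Server 2012: lists every base table
-- and its columns so a template transformation can be injected per table.
SELECT
    t.TABLE_SCHEMA,
    t.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE,
    c.ORDINAL_POSITION
FROM INFORMATION_SCHEMA.TABLES t
JOIN INFORMATION_SCHEMA.COLUMNS c
  ON c.TABLE_SCHEMA = t.TABLE_SCHEMA
 AND c.TABLE_NAME   = t.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'
  AND t.TABLE_SCHEMA = 'dbo'          -- assumption: adjust to your schema
ORDER BY t.TABLE_NAME, c.ORDINAL_POSITION;
```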

But to perform different operations on different columns, you will either need a pattern in the columns to keep using Metadata Injection, or you will have to create some files or tables in the database that indicate, for example: on column table1.column1 I need to perform a split operation using the character "x" as the delimiter, and keep the results in fields field1, field2, etc. A sketch of such a control table is below.
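Just as an illustration of that idea, a minimal control table might look like this; all the table, column, and field names here are made up, and you would read it with a Table Input step to drive the injection:

```sql
-- Hypothetical control table: one row per column that needs a transformation.
CREATE TABLE etl_column_rules (
    source_table   VARCHAR(128) NOT NULL,
    source_column  VARCHAR(128) NOT NULL,
    operation      VARCHAR(32)  NOT NULL,  -- e.g. 'split', 'trim', 'to_date'
    delimiter      VARCHAR(8)   NULL,      -- only used when operation = 'split'
    target_fields  VARCHAR(256) NULL       -- e.g. 'field1,field2'
);

-- Example rules matching the split case described above.
INSERT INTO etl_column_rules VALUES
    ('table1', 'column1', 'split',   'x',  'field1,field2'),
    ('table1', 'column2', 'to_date', NULL, NULL);
```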

Regards