Hi All,
Could you please help me find a solution to a problem?
I'm using Pentaho Kettle v6.1 for my data integration. The source system produces a JSON file whose size varies with the load on the source system, from 10 KB up to 1 GB or more per 15 minutes (in peak hours).
Could you please let me know whether it is feasible to create transformations/jobs to load a 1 GB file via Pentaho? If yes, what would be the best approach, and if not, how can I achieve this requirement?
Please help!
Regards,
Kapil Bhardwaj
I have not tried a 1 GB file, but I have worked with files around 100 MB. Start with the JSON Input step; if you run into a specific problem, you can post it in the forums.
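If the JSON Input step runs out of memory on the larger files, two things are worth trying. First, raise the heap of the JVM that Kettle runs in (in recent PDI versions this is set via the PENTAHO_DI_JAVA_OPTIONS variable in spoon.sh/kitchen.sh, e.g. -Xmx4g). Second, pre-flatten the file outside Kettle with a streaming parser and feed the result to a CSV file input step, which reads row by row. Below is a minimal sketch of such a pre-processor using Jackson's streaming API; it is not part of Kettle itself, and it assumes your file is a single top-level JSON array of flat records. The paths input.json and output.csv are placeholders, not anything from your setup.

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

// Streams a large JSON array of flat records into CSV rows,
// keeping memory use roughly constant regardless of file size.
public class JsonToCsv {
    public static void main(String[] args) throws IOException {
        JsonFactory factory = new JsonFactory();
        try (JsonParser parser = factory.createParser(new File("input.json")); // placeholder path
             FileWriter out = new FileWriter("output.csv")) {                  // placeholder path
            if (parser.nextToken() != JsonToken.START_ARRAY) {
                throw new IllegalStateException("expected a top-level JSON array");
            }
            // One iteration per record; only the current record is in memory.
            while (parser.nextToken() == JsonToken.START_OBJECT) {
                StringBuilder row = new StringBuilder();
                while (parser.nextToken() != JsonToken.END_OBJECT) {
                    parser.nextToken();                     // advance from field name to its value
                    if (row.length() > 0) row.append(',');
                    row.append(parser.getValueAsString()); // scalar values only; no CSV quoting here
                }
                out.write(row.append('\n').toString());
            }
        }
    }
}

This only needs jackson-core on the classpath, and you could run it from a Shell job entry before the transformation. Note the sketch does no CSV quoting or escaping, so values containing commas or newlines would need extra handling.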