Amazon S3 multi part upload error using Pentaho 9.3

  • 1.  Amazon S3 multi part upload error using Pentaho 9.3

    Posted 03-05-2024 01:32

    I am trying to upload multiple GZip files to Amazon S3 using Pentaho 9.3. I have also set the part size to the maximum in kettle.properties, but I am still facing the S3 multipart error. Reference image for the kettle.properties setting:
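    In case the screenshot does not come through, the kettle.properties change I am referring to is along these lines. The exact property name may differ in your install; s3.vfs.partSize is what I understand controls the multipart part size for the S3 VFS, and the value is only an example:

        # kettle.properties (in the .kettle folder) - illustrative values only
        # s3.vfs.partSize is assumed to be the S3 VFS multipart part-size option
        s3.vfs.partSize=5GB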

    Currently, I am using Pentaho 7.1 and have not faced any issues with it to date. I want to upgrade to a newer version of Pentaho, but I constantly get an 'S3 multipart exception caught' error. I have also increased the JVM heap size, but that did not help. Reference image for the S3 error:
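    For reference, I increased the JVM memory through the PENTAHO_DI_JAVA_OPTIONS environment variable that the Spoon/Kitchen launch scripts read; the heap values below are just examples, not my exact settings:

        # set before launching spoon.sh / kitchen.sh (example values only)
        export PENTAHO_DI_JAVA_OPTIONS="-Xms2048m -Xmx8192m"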

    I have created a Pentaho job (see the Sample Job image), and in this flow it uploads the data to an Amazon S3 bucket. It only creates the S3 files that are small and skips the larger ones. For example, the delete output file is small (in the KB range), so every run it uploads only the delete file and not the insert and update files (in the MB range). Reference image for the Amazon S3 bucket:



    ------------------------------
    Niti Khamker
    Others
    Inferenz Tech Private Limited
    ------------------------------