2023/11/23 06:15:07 - Kitchen - Logging is at level : Debug
2023/11/23 06:15:07 - Kitchen - Start of run.
2023/11/23 06:15:07 - Kitchen - Allocate new job.
2023/11/23 06:15:07 - Kitchen - Parsing command line options.
2023/11/23 06:15:14 - start_all_facts - Start of job execution
2023/11/23 06:15:14 - start_all_facts - exec(0, 0, START.0)
2023/11/23 06:15:14 - START - Starting job entry
2023/11/23 06:15:14 - start_all_facts - Starting entry [Process Destination System and Job Type Filter]
2023/11/23 06:15:14 - start_all_facts - exec(1, 0, Process Destination System and Job Type Filter.0)
2023/11/23 06:15:14 - Process Destination System and Job Type Filter - Starting job entry
2023/11/23 06:15:14 - Process Destination System and Job Type Filter - Opening transformation: [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_parameters.ktr]
2023/11/23 06:15:15 - Process Destination System and Job Type Filter - Starting transformation...(file=${Internal.Job.Filename.Directory}/get_parameters.ktr, name=Process Destination System and Job Type Filter, repinfo=null)
2023/11/23 06:15:16 - get_destination_system_filter - Transformation is pre-loaded.
2023/11/23 06:15:16 - get_destination_system_filter - nr of steps to run : 3 , nr of hops : 2
2023/11/23 06:15:16 - get_destination_system_filter - Dispatching started for transformation [get_destination_system_filter]
2023/11/23 06:15:16 - get_destination_system_filter - Nr of arguments detected:0
2023/11/23 06:15:16 - get_destination_system_filter - This is not a replay transformation
2023/11/23 06:15:16 - get_destination_system_filter - I found 3 different steps to launch.
2023/11/23 06:15:16 - get_destination_system_filter - Allocating rowsets...
2023/11/23 06:15:16 - get_destination_system_filter - Allocating rowsets for step 0 --> Generate Destination System Filter
2023/11/23 06:15:16 - get_destination_system_filter - prevcopies = 1, nextcopies=1
2023/11/23 06:15:16 - get_destination_system_filter - Transformation allocated new rowset [Generate Destination System Filter.0 - Set Variables for destination system handling.0]
2023/11/23 06:15:16 - get_destination_system_filter - Allocated 1 rowsets for step 0 --> Generate Destination System Filter
2023/11/23 06:15:16 - get_destination_system_filter - Allocating rowsets for step 1 --> Set Variables for destination system handling
2023/11/23 06:15:16 - get_destination_system_filter - Allocated 1 rowsets for step 1 --> Set Variables for destination system handling
2023/11/23 06:15:16 - get_destination_system_filter - Allocating rowsets for step 2 --> Generate Rows to run JS once
2023/11/23 06:15:16 - get_destination_system_filter - prevcopies = 1, nextcopies=1
2023/11/23 06:15:16 - get_destination_system_filter - Transformation allocated new rowset [Generate Rows to run JS once.0 - Generate Destination System Filter.0]
2023/11/23 06:15:16 - get_destination_system_filter - Allocated 2 rowsets for step 2 --> Generate Rows to run JS once
2023/11/23 06:15:16 - get_destination_system_filter - Allocating Steps & StepData...
2023/11/23 06:15:16 - get_destination_system_filter - Transformation is about to allocate step [Generate Destination System Filter] of type [ScriptValueMod]
2023/11/23 06:15:16 - get_destination_system_filter - Step has nrcopies=1
2023/11/23 06:15:16 - Generate Destination System Filter.0 - distribution activated
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:16 - Generate Destination System Filter.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:16 - Generate Destination System Filter.0 - input rel is 1:1
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Found input rowset [Generate Rows to run JS once.0 - Generate Destination System Filter.0]
2023/11/23 06:15:16 - Generate Destination System Filter.0 - output rel. is 1:1
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Found output rowset [Generate Destination System Filter.0 - Set Variables for destination system handling.0]
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Finished dispatching
2023/11/23 06:15:16 - get_destination_system_filter - Transformation has allocated a new step: [Generate Destination System Filter].0
2023/11/23 06:15:16 - get_destination_system_filter - Transformation is about to allocate step [Set Variables for destination system handling] of type [SetVariable]
2023/11/23 06:15:16 - get_destination_system_filter - Step has nrcopies=1
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - distribution activated
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - input rel is 1:1
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Found input rowset [Generate Destination System Filter.0 - Set Variables for destination system handling.0]
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Finished dispatching
2023/11/23 06:15:16 - get_destination_system_filter - Transformation has allocated a new step: [Set Variables for destination system handling].0
2023/11/23 06:15:16 - get_destination_system_filter - Transformation is about to allocate step [Generate Rows to run JS once] of type [RowGenerator]
2023/11/23 06:15:16 - get_destination_system_filter - Step has nrcopies=1
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - distribution activated
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - output rel. is 1:1
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Found output rowset [Generate Rows to run JS once.0 - Generate Destination System Filter.0]
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Finished dispatching
2023/11/23 06:15:16 - get_destination_system_filter - Transformation has allocated a new step: [Generate Rows to run JS once].0
2023/11/23 06:15:16 - get_destination_system_filter - This transformation can be replayed with replay date: 2023/11/23 06:15:16
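The allocation phase above reflects PDI's execution model: every step copy becomes its own thread, and every hop becomes a bounded row buffer (a "rowset", 10,000 rows by default) between a producer and a consumer, which is why the log counts threads and rowsets separately. A minimal conceptual sketch of that model in Python — illustrative only, not Kettle source; the step bodies are stand-ins for the JavaScript and Set Variables steps named in the log:

```python
import threading
from queue import Queue

ROWSET_SIZE = 10000  # PDI's default rowset buffer size

def row_generator(out_rowset):
    # "Generate Rows to run JS once": emit one empty row, then signal done
    out_rowset.put({})
    out_rowset.put(None)  # sentinel ~ "Signaling 'output done'"

def script_step(in_rowset, out_rowset):
    # "Generate Destination System Filter": transform each incoming row
    # (the real step sets four filter fields; one shown here)
    while (row := in_rowset.get()) is not None:
        row["DESTINATIONSYSTEMFILTER"] = "'S3'"
        out_rowset.put(row)
    out_rowset.put(None)

def set_variables_step(in_rowset, variables):
    # "Set Variables...": copy row fields into job-scope variables
    while (row := in_rowset.get()) is not None:
        variables.update(row)

# Two rowsets and three threads, mirroring
# "Transformation has allocated 3 threads and 2 rowsets."
gen_to_js, js_to_setvar = Queue(ROWSET_SIZE), Queue(ROWSET_SIZE)
variables = {}
threads = [
    threading.Thread(target=row_generator, args=(gen_to_js,)),
    threading.Thread(target=script_step, args=(gen_to_js, js_to_setvar)),
    threading.Thread(target=set_variables_step, args=(js_to_setvar, variables)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(variables)
```

Back-pressure falls out of the bounded queues: a slow consumer blocks its producer once a rowset fills, which lets a transformation stream large tables in roughly constant memory.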
2023/11/23 06:15:16 - get_destination_system_filter - Initialising 3 steps...
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Released server socket on port 0
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Released server socket on port 0
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Released server socket on port 0
2023/11/23 06:15:16 - get_destination_system_filter - Step [Generate Destination System Filter.0] initialized flawlessly.
2023/11/23 06:15:16 - get_destination_system_filter - Step [Set Variables for destination system handling.0] initialized flawlessly.
2023/11/23 06:15:16 - get_destination_system_filter - Step [Generate Rows to run JS once.0] initialized flawlessly.
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Starting to run...
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Starting to run...
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Starting to run...
2023/11/23 06:15:16 - get_destination_system_filter - Transformation has allocated 3 threads and 2 rowsets.
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:16 - Generate Rows to run JS once.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2023/11/23 06:15:16 - Generate Destination System Filter.0 - This script is using 0 values from the input stream(s)
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Optimization level set to 9.
2023/11/23 06:15:16 - Generate Destination System Filter.0 - No starting Script found!
2023/11/23 06:15:16 - Generate Destination System Filter.0 - No tran_Status found. Transformation status checking not available.
2023/11/23 06:15:16 - Generate Destination System Filter.0 - No end Script found!
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:16 - Generate Destination System Filter.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Setting environment variables...
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Set variable SOURCESYSTEMFILENAMESUFFIX to value [TDS]
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Set variable DESTINATIONSYSTEMFILTER to value ['S3']
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Set variable DESTINATIONSYSTEMFILENAMESUFFIX to value [S3]
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Set variable JOBTYPEFILTER to value [SCHEDULED]
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Finished after 1 rows.
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Signaling 'output done' to 0 output rowsets.
2023/11/23 06:15:16 - Set Variables for destination system handling.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:16 - start_all_facts - Starting entry [Safe to continue?]
2023/11/23 06:15:16 - start_all_facts - exec(2, 0, Safe to continue?.0)
2023/11/23 06:15:16 - Safe to continue? - Starting job entry
2023/11/23 06:15:16 - Safe to continue? - Found 0 previous result rows
2023/11/23 06:15:16 - Safe to continue? - Running on platform : Linux
2023/11/23 06:15:16 - Safe to continue? - Executing command : /tmp/kettle_abcc8acb-89c7-11ee-8bfd-57d772e11e34shell sync_facts TDS S3 SCHEDULED
2023/11/23 06:15:16 - Safe to continue? - The variable awt.toolkit has a total of 20 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.CopyNr has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.specification.version has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_REPOSITORY has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.cpu.isalist has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.jnu.encoding has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.arch.data.model has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable ACCESS_KEY has a total of 20 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vendor.url has a total of 23 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.Unique.Count has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable SCRIPT_SUFFIX has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS.PORTNUMBER has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable REDSHIFT.JDBC.CONNECTION has a total of 68 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable JOBTYPEFILTER has a total of 9 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.boot.library.path has a total of 36 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.java.command has a total of 581 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable org.osjava.sj.delimiter has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable NEW_ARN has a total of 79 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.specification.vendor has a total of 18 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP-MASTER.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.home has a total of 26 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP.PORTNUMBER has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable file.separator has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS.HOSTNAME has a total of 12 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable line.separator has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable vfs.sftp.userDirIsRoot has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.specification.name has a total of 31 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.specification.vendor has a total of 18 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Job.Name has a total of 15 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable JOBS_PROCESSED_PER_TABLE has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable ARN has a total of 79 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.boot.class.path has a total of 325 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.management.compiler has a total of 31 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.runtime.version has a total of 13 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.name has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable SECRET_KEY has a total of 40 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS-ADHOC.PORTNUMBER has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable DESTINATION_SYSTEM has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP.USERNAME has a total of 7 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable DC_SHARD_ID has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable file.encoding has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP.HOSTNAME has a total of 11 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP-MASTER.USERNAME has a total of 7 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_EMPTY_STRING_DIFFERS_FROM_NULL has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP.DATABASENAME has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS-ADHOC.HOSTNAME has a total of 12 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS-ADHOC.DATABASENAME has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Job.Filename.Name has a total of 19 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.Unique.Number has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.io.tmpdir has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.version has a total of 9 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_JNDI_ROOT has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.specification.name has a total of 34 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.awt.printerjob has a total of 22 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.os.patch.level has a total of 7 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.library.path has a total of 25 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vendor has a total of 18 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS.USERNAME has a total of 7 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable IS_AUDIT_TABLE has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.io.unicode.encoding has a total of 13 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Slave.Server.Name has a total of 16 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Kettle.Version has a total of 11 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable file.encoding.pkg has a total of 6 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_LOG_SIZE_LIMIT has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable PK_COLUMN has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.class.path has a total of 51 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Job.Filename.Directory has a total of 65 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.vendor has a total of 18 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable DESTINATIONSYSTEMFILENAMESUFFIX has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable NEW_REDSHIFT.JDBC.CONNECTION has a total of 129 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable NEW_REDSHIFT.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.timezone has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Kettle.Build.Date has a total of 19 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable os.name has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.specification.version has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_PASSWORD has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.java.launcher has a total of 12 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.country has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable DESTINATIONSYSTEMFILTER has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Entry.Current.Directory has a total of 65 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable sun.cpu.endian has a total of 6 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.home has a total of 11 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.Partition.Number has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TABLE_OUT_OF_SYNC has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.language has a total of 2 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Job.Repository.Directory has a total of 65 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_USER has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable SOURCE_SYSTEM has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.awt.graphicsenv has a total of 30 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable JOB_TYPE has a total of 9 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable REDSHIFT.USERNAME has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.Name has a total of 9 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_PLUGIN_PACKAGES has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Step.Partition.ID has a total of 12 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable PAYMO_ENV has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable BATCH_COMMIT_SIZE has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP-MASTER.DATABASENAME has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable ADDITIONAL_AUDIT_TABLE_FIELDS has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.naming.factory.initial has a total of 34 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable SOURCESYSTEMFILENAMESUFFIX has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable path.separator has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP-MASTER.PORTNUMBER has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable os.version has a total of 27 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS-ADHOC.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.endorsed.dirs has a total of 113 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.runtime.name has a total of 31 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable REDSHIFT.PASSWORD has a total of 43 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS.DATABASENAME has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.name has a total of 33 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable KETTLE_HOME has a total of 0 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TDS-ADHOC.USERNAME has a total of 7 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vendor.url.bug has a total of 35 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable OLTP-MASTER.HOSTNAME has a total of 11 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable user.dir has a total of 41 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable os.arch has a total of 5 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Slave.Transformation.Number has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable TIME_BASED_JOB_PICKUP_DELAY has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable org.osjava.sj.root has a total of 53 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Kettle.Build.Version has a total of 3 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.info has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.vm.version has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable NEW_REDSHIFT.USERNAME has a total of 10 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.ext.dirs has a total of 61 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable java.class.version has a total of 4 characters.
2023/11/23 06:15:16 - Safe to continue? - The variable Internal.Cluster.Size has a total of 1 characters.
2023/11/23 06:15:16 - Safe to continue? - (stdout) filename lock: sync_facts_TDS_S3_SCHEDULED.lock
2023/11/23 06:15:16 - Safe to continue? - (stdout) filename pid: sync_facts_TDS_S3_SCHEDULED.pid
2023/11/23 06:15:16 - Safe to continue? - (stdout) pid file does not exist, creating for kettle PID 7935
2023/11/23 06:15:16 - Safe to continue? - Command /tmp/kettle_abcc8acb-89c7-11ee-8bfd-57d772e11e34shell sync_facts TDS S3 SCHEDULED has finished
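The `Safe to continue?` entry shells out to a guard script (materialized by Kettle under /tmp) that serializes runs per (job, source, destination, job type) tuple via the `.lock`/`.pid` files named in the stdout lines above. The script itself is not in the log; what follows is a plausible sketch of the pid-file pattern those lines suggest, written in Python rather than the original shell:

```python
import os
import sys

def safe_to_continue(job_name, source, dest, job_type, pid):
    # One guard file pair per (job, source, destination, job type),
    # matching the logged names sync_facts_TDS_S3_SCHEDULED.{lock,pid}.
    base = f"{job_name}_{source}_{dest}_{job_type}"
    lock_file = f"{base}.lock"
    pid_file = f"{base}.pid"
    print(f"filename lock: {lock_file}")
    print(f"filename pid: {pid_file}")
    if os.path.exists(pid_file):
        old_pid = int(open(pid_file).read().strip())
        try:
            os.kill(old_pid, 0)  # signal 0: existence check only
        except ProcessLookupError:
            pass  # stale pid file; the previous run died
        else:
            sys.exit(f"{base} already running as PID {old_pid}")
    print(f"pid file does not exist, creating for kettle PID {pid}")
    with open(pid_file, "w") as f:
        f.write(str(pid))

safe_to_continue("sync_facts", "TDS", "S3", "SCHEDULED", os.getpid())
```

In this sketch a stale pid file left by a crashed run is detected with signal 0, so the guard does not deadlock after failures; whether the real script does the same is not visible in the log.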
2023/11/23 06:15:16 - start_all_facts - Starting entry [Sync Time based Tables]
2023/11/23 06:15:16 - start_all_facts - exec(3, 0, Sync Time based Tables.0)
2023/11/23 06:15:16 - Sync Time based Tables - Starting job entry
2023/11/23 06:15:16 - Sync Time based Tables - Loading job from XML file : [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/start_sync_time_based_tables.kjb]
2023/11/23 06:15:16 - start_sync_time_based_tables - exec(4, 0, START.0)
2023/11/23 06:15:16 - START - Starting job entry
2023/11/23 06:15:16 - start_sync_time_based_tables - Starting entry [Time based job pickup delay]
2023/11/23 06:15:16 - start_sync_time_based_tables - exec(5, 0, Time based job pickup delay.0)
2023/11/23 06:15:16 - Time based job pickup delay - Starting job entry
2023/11/23 06:15:16 - Time based job pickup delay - Let's wait now for: 1.0 Seconds...
2023/11/23 06:15:18 - Time based job pickup delay - Wait time is reached.
2023/11/23 06:15:18 - start_sync_time_based_tables - Starting entry [Get heartbeat]
2023/11/23 06:15:18 - start_sync_time_based_tables - exec(6, 0, Get heartbeat.0)
2023/11/23 06:15:18 - Get heartbeat - Starting job entry
2023/11/23 06:15:18 - Get heartbeat - Opening transformation: [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_TDS_heartbeat_from_table.ktr]
2023/11/23 06:15:18 - Get heartbeat - Starting transformation...(file=${Internal.Job.Filename.Directory}/get_${SOURCESYSTEMFILENAMESUFFIX}_heartbeat_from_table.ktr, name=Get heartbeat, repinfo=null)
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation is pre-loaded.
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - nr of steps to run : 3 , nr of hops : 2
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Dispatching started for transformation [get_TDS_heartbeat_from_table]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Nr of arguments detected:0
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - This is not a replay transformation
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - I found 3 different steps to launch.
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocating rowsets...
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocating rowsets for step 0 --> Read heartbeat from table
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - prevcopies = 1, nextcopies=1
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation allocated new rowset [Read heartbeat from table.0 - Rename value.0]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocated 1 rowsets for step 0 --> Read heartbeat from table
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocating rowsets for step 1 --> Rename value
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - prevcopies = 1, nextcopies=1
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation allocated new rowset [Rename value.0 - Set DatacenterHeartbeat Variable.0]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocated 2 rowsets for step 1 --> Rename value
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocating rowsets for step 2 --> Set DatacenterHeartbeat Variable
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocated 2 rowsets for step 2 --> Set DatacenterHeartbeat Variable
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Allocating Steps & StepData...
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation is about to allocate step [Read heartbeat from table] of type [TableInput]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Step has nrcopies=1
2023/11/23 06:15:18 - Read heartbeat from table.0 - distribution activated
2023/11/23 06:15:18 - Read heartbeat from table.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:18 - Read heartbeat from table.0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:18 - Read heartbeat from table.0 - output rel. is 1:1
2023/11/23 06:15:18 - Read heartbeat from table.0 - Found output rowset [Read heartbeat from table.0 - Rename value.0]
2023/11/23 06:15:18 - Read heartbeat from table.0 - Finished dispatching
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation has allocated a new step: [Read heartbeat from table].0
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation is about to allocate step [Rename value] of type [SelectValues]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Step has nrcopies=1
2023/11/23 06:15:18 - Rename value.0 - distribution activated
2023/11/23 06:15:18 - Rename value.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:18 - Rename value.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:18 - Rename value.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:18 - Rename value.0 - input rel is 1:1
2023/11/23 06:15:18 - Rename value.0 - Found input rowset [Read heartbeat from table.0 - Rename value.0]
2023/11/23 06:15:18 - Rename value.0 - output rel. is 1:1
2023/11/23 06:15:18 - Rename value.0 - Found output rowset [Rename value.0 - Set DatacenterHeartbeat Variable.0]
2023/11/23 06:15:18 - Rename value.0 - Finished dispatching
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation has allocated a new step: [Rename value].0
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation is about to allocate step [Set DatacenterHeartbeat Variable] of type [SetVariable]
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Step has nrcopies=1
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - distribution activated
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - input rel is 1:1
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - Found input rowset [Rename value.0 - Set DatacenterHeartbeat Variable.0]
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - Finished dispatching
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Transformation has allocated a new step: [Set DatacenterHeartbeat Variable].0
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - This transformation can be replayed with replay date: 2023/11/23 06:15:18
2023/11/23 06:15:18 - get_TDS_heartbeat_from_table - Initialising 3 steps...
2023/11/23 06:15:18 - Read heartbeat from table.0 - Released server socket on port 0
2023/11/23 06:15:18 - Rename value.0 - Released server socket on port 0
2023/11/23 06:15:18 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:18 - Set DatacenterHeartbeat Variable.0 - Released server socket on port 0
2023/11/23 06:15:19 - Read heartbeat from table.0 - Connected to database...
2023/11/23 06:15:19 - get_TDS_heartbeat_from_table - Step [Read heartbeat from table.0] initialized flawlessly.
2023/11/23 06:15:19 - get_TDS_heartbeat_from_table - Step [Rename value.0] initialized flawlessly.
2023/11/23 06:15:19 - get_TDS_heartbeat_from_table - Step [Set DatacenterHeartbeat Variable.0] initialized flawlessly.
2023/11/23 06:15:19 - Read heartbeat from table.0 - Starting to run...
2023/11/23 06:15:19 - Read heartbeat from table.0 - SQL query : select
2023/11/23 06:15:19 - Read heartbeat from table.0 - DATE_FORMAT(min(ts), '%Y-%m-%d %H:%i:%s') as minHeartbeat
2023/11/23 06:15:19 - Read heartbeat from table.0 - from
2023/11/23 06:15:19 - Read heartbeat from table.0 - heartbeat.heartbeat;
2023/11/23 06:15:19 - Rename value.0 - Starting to run...
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Starting to run...
2023/11/23 06:15:19 - get_TDS_heartbeat_from_table - Transformation has allocated 3 threads and 2 rowsets.
2023/11/23 06:15:19 - Read heartbeat from table.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Read heartbeat from table.0 - Finished reading query, closing connection
2023/11/23 06:15:19 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:19 - Read heartbeat from table.0 - Finished processing (I=1, O=0, R=0, W=1, U=0, E=0)
2023/11/23 06:15:19 - Rename value.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Rename value.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Setting environment variables...
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Set variable DATACENTERHEARTBEAT to value [2023-11-23 06:15:19]
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Finished after 1 rows.
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Signaling 'output done' to 0 output rowsets.
2023/11/23 06:15:19 - Set DatacenterHeartbeat Variable.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
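`DATACENTERHEARTBEAT` acts as a replication low-watermark: the transformation reads `min(ts)` across the rows of `heartbeat.heartbeat` (presumably one heartbeat per replication channel, so the minimum is the safest bound) and the value later caps the job windows via `LEAST(now(), ...)` in the scheduling query. Here the heartbeat equals the query second, so the replica is effectively caught up. Roughly equivalent client code, with the driver and connection details as stand-ins (the log only shows Kettle's own `TDS_CONNECTION`):

```python
import pymysql  # assumed driver, not taken from the log

# SQL as it appears in the log
HEARTBEAT_SQL = """
    select
        DATE_FORMAT(min(ts), '%Y-%m-%d %H:%i:%s') as minHeartbeat
    from
        heartbeat.heartbeat
"""

def get_datacenter_heartbeat(host, user, password):
    """Read the replication low-watermark, as the
    get_TDS_heartbeat_from_table transformation does."""
    conn = pymysql.connect(host=host, user=user, password=password)
    try:
        with conn.cursor() as cur:
            cur.execute(HEARTBEAT_SQL)
            (min_heartbeat,) = cur.fetchone()
            return min_heartbeat  # becomes ${DATACENTERHEARTBEAT}
    finally:
        conn.close()
```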
2023/11/23 06:15:19 - start_sync_time_based_tables - Starting entry [Get Sync Jobs]
2023/11/23 06:15:19 - start_sync_time_based_tables - exec(7, 0, Get Sync Jobs.0)
2023/11/23 06:15:19 - Get Sync Jobs - Starting job entry
2023/11/23 06:15:19 - Get Sync Jobs - Opening transformation: [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_sync_jobs.ktr]
2023/11/23 06:15:19 - Get Sync Jobs - Starting transformation...(file=${Internal.Job.Filename.Directory}/get_sync_jobs.ktr, name=Get Sync Jobs, repinfo=null)
2023/11/23 06:15:19 - get_sync_jobs - Transformation is pre-loaded.
2023/11/23 06:15:19 - get_sync_jobs - nr of steps to run : 4 , nr of hops : 3
2023/11/23 06:15:19 - get_sync_jobs - Dispatching started for transformation [get_sync_jobs]
2023/11/23 06:15:19 - get_sync_jobs - Nr of arguments detected:0
2023/11/23 06:15:19 - get_sync_jobs - This is not a replay transformation
2023/11/23 06:15:19 - get_sync_jobs - I found 4 different steps to launch.
2023/11/23 06:15:19 - get_sync_jobs - Allocating rowsets...
2023/11/23 06:15:19 - get_sync_jobs - Allocating rowsets for step 0 --> Rename column name
2023/11/23 06:15:19 - get_sync_jobs - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - get_sync_jobs - Transformation allocated new rowset [Rename column name.0 - Copy rows to result.0]
2023/11/23 06:15:19 - get_sync_jobs - Allocated 1 rowsets for step 0 --> Rename column name
2023/11/23 06:15:19 - get_sync_jobs - Allocating rowsets for step 1 --> Copy rows to result
2023/11/23 06:15:19 - get_sync_jobs - Allocated 1 rowsets for step 1 --> Copy rows to result
2023/11/23 06:15:19 - get_sync_jobs - Allocating rowsets for step 2 --> Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE)
2023/11/23 06:15:19 - get_sync_jobs - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - get_sync_jobs - Transformation allocated new rowset [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Get synchronization jobs.0]
2023/11/23 06:15:19 - get_sync_jobs - Allocated 2 rowsets for step 2 --> Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE)
2023/11/23 06:15:19 - get_sync_jobs - Allocating rowsets for step 3 --> Get synchronization jobs
2023/11/23 06:15:19 - get_sync_jobs - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - get_sync_jobs - Transformation allocated new rowset [Get synchronization jobs.0 - Rename column name.0]
2023/11/23 06:15:19 - get_sync_jobs - Allocated 3 rowsets for step 3 --> Get synchronization jobs
2023/11/23 06:15:19 - get_sync_jobs - Allocating Steps & StepData...
2023/11/23 06:15:19 - get_sync_jobs - Transformation is about to allocate step [Rename column name] of type [SelectValues]
2023/11/23 06:15:19 - get_sync_jobs - Step has nrcopies=1
2023/11/23 06:15:19 - Rename column name.0 - distribution de-activated
2023/11/23 06:15:19 - Rename column name.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Rename column name.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:19 - Rename column name.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Rename column name.0 - input rel is 1:1
2023/11/23 06:15:19 - Rename column name.0 - Found input rowset [Get synchronization jobs.0 - Rename column name.0]
2023/11/23 06:15:19 - Rename column name.0 - output rel. is 1:1
2023/11/23 06:15:19 - Rename column name.0 - Found output rowset [Rename column name.0 - Copy rows to result.0]
2023/11/23 06:15:19 - Rename column name.0 - Finished dispatching
2023/11/23 06:15:19 - get_sync_jobs - Transformation has allocated a new step: [Rename column name].0
2023/11/23 06:15:19 - get_sync_jobs - Transformation is about to allocate step [Copy rows to result] of type [RowsToResult]
2023/11/23 06:15:19 - get_sync_jobs - Step has nrcopies=1
2023/11/23 06:15:19 - Copy rows to result.0 - distribution activated
2023/11/23 06:15:19 - Copy rows to result.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Copy rows to result.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:19 - Copy rows to result.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Copy rows to result.0 - input rel is 1:1
2023/11/23 06:15:19 - Copy rows to result.0 - Found input rowset [Rename column name.0 - Copy rows to result.0]
2023/11/23 06:15:19 - Copy rows to result.0 - Finished dispatching
2023/11/23 06:15:19 - get_sync_jobs - Transformation has allocated a new step: [Copy rows to result].0
2023/11/23 06:15:19 - get_sync_jobs - Transformation is about to allocate step [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE)] of type [GetVariable]
2023/11/23 06:15:19 - get_sync_jobs - Step has nrcopies=1
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - distribution de-activated
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - output rel. is 1:1
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Found output rowset [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Get synchronization jobs.0]
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Finished dispatching
2023/11/23 06:15:19 - get_sync_jobs - Transformation has allocated a new step: [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE)].0
2023/11/23 06:15:19 - get_sync_jobs - Transformation is about to allocate step [Get synchronization jobs] of type [TableInput]
2023/11/23 06:15:19 - get_sync_jobs - Step has nrcopies=1
2023/11/23 06:15:19 - Get synchronization jobs.0 - distribution activated
2023/11/23 06:15:19 - Get synchronization jobs.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Get synchronization jobs.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:19 - Get synchronization jobs.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Get synchronization jobs.0 - input rel is 1:1
2023/11/23 06:15:19 - Get synchronization jobs.0 - Found input rowset [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Get synchronization jobs.0]
2023/11/23 06:15:19 - Get synchronization jobs.0 - output rel. is 1:1
2023/11/23 06:15:19 - Get synchronization jobs.0 - Found output rowset [Get synchronization jobs.0 - Rename column name.0]
2023/11/23 06:15:19 - Get synchronization jobs.0 - Finished dispatching
2023/11/23 06:15:19 - get_sync_jobs - Transformation has allocated a new step: [Get synchronization jobs].0
2023/11/23 06:15:19 - get_sync_jobs - This transformation can be replayed with replay date: 2023/11/23 06:15:19
2023/11/23 06:15:19 - get_sync_jobs - Initialising 4 steps...
2023/11/23 06:15:19 - Rename column name.0 - Released server socket on port 0
2023/11/23 06:15:19 - Copy rows to result.0 - Released server socket on port 0
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Released server socket on port 0
2023/11/23 06:15:19 - Get synchronization jobs.0 - Released server socket on port 0
2023/11/23 06:15:19 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:19 - Get synchronization jobs.0 - Connected to database...
2023/11/23 06:15:19 - get_sync_jobs - Step [Rename column name.0] initialized flawlessly.
2023/11/23 06:15:19 - get_sync_jobs - Step [Copy rows to result.0] initialized flawlessly.
2023/11/23 06:15:19 - get_sync_jobs - Step [Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0] initialized flawlessly.
2023/11/23 06:15:19 - get_sync_jobs - Step [Get synchronization jobs.0] initialized flawlessly.
2023/11/23 06:15:19 - Rename column name.0 - Starting to run...
2023/11/23 06:15:19 - Copy rows to result.0 - Starting to run...
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Starting to run...
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - field [DATACENTERHEARTBEAT] has value [2023-11-23 06:15:19]
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - field [DESTINATIONSYSTEMFILTER] has value ['S3']
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - field [JOBTYPEFILTER] has value [SCHEDULED]
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Get Variables (HEARTBEAT, DEST.SYSTEM, JOBTYPE).0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:19 - Get synchronization jobs.0 - Starting to run...
2023/11/23 06:15:19 - Get synchronization jobs.0 - SQL query : -- select max 20 jobs per table
2023/11/23 06:15:19 - Get synchronization jobs.0 - select
2023/11/23 06:15:19 - Get synchronization jobs.0 - s.id
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,c.id AS etlTableConfigId
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,c.sourceDbName, c.destinationDbName ,c.destinationTableName, c.dateColumn
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,DATE_FORMAT(s.rangeStartDate, '%Y-%m-%d %H:%i:%s') as rangeStartDate
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,DATE_FORMAT(s.rangeEndDate, '%Y-%m-%d %H:%i:%s') as rangeEndDate
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,@rank := if(@etlTableConfigId = c.id, @rank+1, 1) as rank
2023/11/23 06:15:19 - Get synchronization jobs.0 - ,@etlTableConfigId := c.id
2023/11/23 06:15:19 - Get synchronization jobs.0 - from
2023/11/23 06:15:19 - Get synchronization jobs.0 - (select @rank := 0, @etlTableConfigId := 0 ) a
2023/11/23 06:15:19 - Get synchronization jobs.0 - inner join etl_job_schedule s force index (ix_tableConfigId_jobTypeId_rangeStartDate)
2023/11/23 06:15:19 - Get synchronization jobs.0 - inner join etl_table_config c on
2023/11/23 06:15:19 - Get synchronization jobs.0 - s.etlTableConfigId = c.id
2023/11/23 06:15:19 - Get synchronization jobs.0 - inner join etl_job_type jt on
2023/11/23 06:15:19 - Get synchronization jobs.0 - s.etlJobTypeId = jt.id
2023/11/23 06:15:19 - Get synchronization jobs.0 - where
2023/11/23 06:15:19 - Get synchronization jobs.0 - s.shardId = 17 and
2023/11/23 06:15:19 - Get synchronization jobs.0 - s.rangeEndDate < LEAST(now(), '2023-11-23 06:15:19') - interval c.jobDelayInMin minute
2023/11/23 06:15:19 - Get synchronization jobs.0 - and s.state is NULL
2023/11/23 06:15:19 - Get synchronization jobs.0 - and c.syncOn = 1
2023/11/23 06:15:19 - Get synchronization jobs.0 - and c.sourceSystemName in ('TDS')
2023/11/23 06:15:19 - Get synchronization jobs.0 - and c.destinationSystemName in ('S3')
2023/11/23 06:15:19 - Get synchronization jobs.0 - and jt.jobType = 'SCHEDULED'
2023/11/23 06:15:19 - Get synchronization jobs.0 - and (c.parentConfigId is null
2023/11/23 06:15:19 - Get synchronization jobs.0 - or (jt.jobType != 'SCHEDULED' and jt.jobType != 'RESCHEDULED')
2023/11/23 06:15:19 - Get synchronization jobs.0 - or usf_checkEtlDependency(c.parentConfigId, s.rangeStartDate, s.rangeEndDate) = 1)
2023/11/23 06:15:19 - Get synchronization jobs.0 -
2023/11/23 06:15:19 - Get synchronization jobs.0 - group by c.id, s.etlJobTypeId, s.rangeStartDate
2023/11/23 06:15:19 - Get synchronization jobs.0 - having rank <= 20
2023/11/23 06:15:19 - Get synchronization jobs.0 - order by c.id, s.etlJobTypeId, s.rangeStartDate
2023/11/23 06:15:19 - Get synchronization jobs.0 - ;
2023/11/23 06:15:19 - get_sync_jobs - Transformation has allocated 4 threads and 3 rowsets.
2023/11/23 06:15:19 - Get synchronization jobs.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Get synchronization jobs.0 - Finished reading query, closing connection
2023/11/23 06:15:19 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:19 - Get synchronization jobs.0 - Finished processing (I=1, O=0, R=0, W=1, U=0, E=0)
2023/11/23 06:15:19 - Rename column name.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Rename column name.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:19 - Copy rows to result.0 - Signaling 'output done' to 0 output rowsets.
2023/11/23 06:15:19 - Copy rows to result.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
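The `@rank := if(@etlTableConfigId = c.id, @rank+1, 1)` / `@etlTableConfigId := c.id` pair in the query above is the classic MySQL 5.x idiom for per-group row numbers: rows arrive ordered by config id and range start date, the counter increments while the id repeats and resets to 1 on a new id, and `having rank <= 20` then enforces the query's own `-- select max 20 jobs per table` comment. On MySQL 8+ the same per-table cap is a window function; a trimmed equivalent as a sketch (table and column names from the log, the dependency and delay filters elided to keep the ranking logic in focus):

```python
# Window-function equivalent of the logged @rank trick (assumes MySQL 8+).
PICK_JOBS_SQL = """
    select id, etlTableConfigId, rangeStartDate, rangeEndDate
    from (
        select s.id, s.etlTableConfigId, s.rangeStartDate, s.rangeEndDate,
               row_number() over (
                   partition by s.etlTableConfigId, s.etlJobTypeId
                   order by s.rangeStartDate
               ) as rnk
        from etl_job_schedule s
        where s.state is null and s.shardId = %s
    ) ranked
    where rnk <= 20  -- "select max 20 jobs per table"
"""

def fetch_pending_jobs(conn, shard_id=17):
    # conn: any DB-API connection, e.g. pymysql
    with conn.cursor() as cur:
        cur.execute(PICK_JOBS_SQL, (shard_id,))
        return cur.fetchall()
```

The user-variable version leans on row-arrival order — note the `force index` hint naming exactly the partition/order columns — a dependency the window function makes explicit.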
2023/11/23 06:15:19 - start_sync_time_based_tables - Starting entry [Sync Time Based Tables]
2023/11/23 06:15:19 - start_sync_time_based_tables - exec(8, 0, Sync Time Based Tables.0)
2023/11/23 06:15:19 - Sync Time Based Tables - Starting job entry
2023/11/23 06:15:19 - Sync Time Based Tables - Loading job from XML file : [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync_one_table_time_based.kjb]
2023/11/23 06:15:19 - sync_one_table_time_based - exec(9, 0, START.0)
2023/11/23 06:15:19 - START - Starting job entry
2023/11/23 06:15:19 - sync_one_table_time_based - Starting entry [Write synchronizing table to log]
2023/11/23 06:15:19 - sync_one_table_time_based - exec(10, 0, Write synchronizing table to log.0)
2023/11/23 06:15:19 - Write synchronizing table to log - Starting job entry
2023/11/23 06:15:19 - Start synchronizing table - bundling > boku-dw-ireland-prod: bundlinguseraccount (JOB-ID: 144042561)
2023/11/23 06:15:19 - sync_one_table_time_based - Starting entry [Update Job: EXECUTING]
2023/11/23 06:15:19 - sync_one_table_time_based - exec(11, 0, Update Job: EXECUTING.0)
2023/11/23 06:15:19 - Update Job: EXECUTING - Starting job entry
2023/11/23 06:15:19 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:19 - Update Job: EXECUTING - Running SQL :UPDATE etl_job_schedule SET startDate = now(), state = 'EXECUTING' where id = 144042561;
2023/11/23 06:15:19 - TDS_CONNECTION - launch DDL statement:
2023/11/23 06:15:19 - TDS_CONNECTION - UPDATE etl_job_schedule SET startDate = now(), state = 'EXECUTING' where id = 144042561
2023/11/23 06:15:19 - TDS_CONNECTION - 1 statement executed
2023/11/23 06:15:19 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:19 - sync_one_table_time_based - Starting entry [Check Timebase Facts File Exists]
2023/11/23 06:15:19 - sync_one_table_time_based - exec(12, 0, Check Timebase Facts File Exists.0)
2023/11/23 06:15:19 - Check Timebase Facts File Exists - Starting job entry
2023/11/23 06:15:19 - Check Timebase Facts File Exists - File [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/sync_time_based_table_TDS_S3_bundlinguseraccount.kjb] exists.
2023/11/23 06:15:19 - sync_one_table_time_based - Starting entry [Synchronize table]
2023/11/23 06:15:19 - sync_one_table_time_based - exec(13, 0, Synchronize table.0)
2023/11/23 06:15:19 - Synchronize table - Starting job entry
2023/11/23 06:15:19 - Synchronize table - Loading job from XML file : [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/sync_time_based_table_TDS_S3_bundlinguseraccount.kjb]
2023/11/23 06:15:19 - sync_time_based_table_TDS_S3_bundlinguseraccount - exec(14, 0, START.0)
2023/11/23 06:15:19 - START - Starting job entry
2023/11/23 06:15:19 - sync_time_based_table_TDS_S3_bundlinguseraccount - Starting entry [Simple evaluation]
2023/11/23 06:15:19 - sync_time_based_table_TDS_S3_bundlinguseraccount - exec(15, 0, Simple evaluation.0)
2023/11/23 06:15:19 - Simple evaluation - Starting job entry
2023/11/23 06:15:19 - Simple evaluation - Checking variable [${JOBTYPEFILTER}] ...
2023/11/23 06:15:19 - Simple evaluation - Value to evaluate is SCHEDULED
2023/11/23 06:15:19 - Simple evaluation - Comparing incoming value [SCHEDULED] with value [S3_LOAD]...
2023/11/23 06:15:19 - sync_time_based_table_TDS_S3_bundlinguseraccount - Starting entry [Build S3 file location]
2023/11/23 06:15:19 - sync_time_based_table_TDS_S3_bundlinguseraccount - exec(16, 0, Build S3 file location.0)
2023/11/23 06:15:19 - Build S3 file location - Starting job entry
2023/11/23 06:15:19 - Build S3 file location - Opening transformation: [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/utils/build_S3_file_location.ktr]
2023/11/23 06:15:19 - Build S3 file location - Starting transformation...(file=${Internal.Job.Filename.Directory}/utils/build_S3_file_location.ktr, name=Build S3 file location, repinfo=null)
2023/11/23 06:15:19 - Build S3 file location - Using run configuration [Pentaho local]
2023/11/23 06:15:19 - build_S3_file_location - Transformation is pre-loaded.
2023/11/23 06:15:19 - build_S3_file_location - nr of steps to run : 4 , nr of hops : 3
2023/11/23 06:15:19 - build_S3_file_location - Dispatching started for transformation [build_S3_file_location]
2023/11/23 06:15:19 - build_S3_file_location - Nr of arguments detected:0
2023/11/23 06:15:19 - build_S3_file_location - This is not a replay transformation
2023/11/23 06:15:19 - build_S3_file_location - I found 4 different steps to launch.
2023/11/23 06:15:19 - build_S3_file_location - Allocating rowsets...
2023/11/23 06:15:19 - build_S3_file_location - Allocating rowsets for step 0 --> Build AWS S3 file location and Redshift Load Query
2023/11/23 06:15:19 - build_S3_file_location - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - build_S3_file_location - Transformation allocated new rowset [Build AWS S3 file location and Redshift Load Query.0 - Select AWS S3 file location.0]
2023/11/23 06:15:19 - build_S3_file_location - Allocated 1 rowsets for step 0 --> Build AWS S3 file location and Redshift Load Query
2023/11/23 06:15:19 - build_S3_file_location - Allocating rowsets for step 1 --> Select AWS S3 file location
2023/11/23 06:15:19 - build_S3_file_location - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - build_S3_file_location - Transformation allocated new rowset [Select AWS S3 file location.0 - Set AWS S3 file location.0]
2023/11/23 06:15:19 - build_S3_file_location - Allocated 2 rowsets for step 1 --> Select AWS S3 file location
2023/11/23 06:15:19 - build_S3_file_location - Allocating rowsets for step 2 --> Set AWS S3 file location
2023/11/23 06:15:19 - build_S3_file_location - Allocated 2 rowsets for step 2 --> Set AWS S3 file location
2023/11/23 06:15:19 - build_S3_file_location - Allocating rowsets for step 3 --> Generate Single Row
2023/11/23 06:15:19 - build_S3_file_location - prevcopies = 1, nextcopies=1
2023/11/23 06:15:19 - build_S3_file_location - Transformation allocated new rowset [Generate Single Row.0 - Build AWS S3 file location and Redshift Load Query.0]
2023/11/23 06:15:19 - build_S3_file_location - Allocated 3 rowsets for step 3 --> Generate Single Row
2023/11/23 06:15:19 - build_S3_file_location - Allocating Steps & StepData...
2023/11/23 06:15:19 - build_S3_file_location - Transformation is about to allocate step [Build AWS S3 file location and Redshift Load Query] of type [ScriptValueMod]
2023/11/23 06:15:19 - build_S3_file_location - Step has nrcopies=1
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - distribution de-activated
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - input rel is 1:1
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Found input rowset [Generate Single Row.0 - Build AWS S3 file location and Redshift Load Query.0]
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - output rel. is 1:1
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Found output rowset [Build AWS S3 file location and Redshift Load Query.0 - Select AWS S3 file location.0]
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Finished dispatching
2023/11/23 06:15:19 - build_S3_file_location - Transformation has allocated a new step: [Build AWS S3 file location and Redshift Load Query].0
2023/11/23 06:15:19 - build_S3_file_location - Transformation is about to allocate step [Select AWS S3 file location] of type [SelectValues]
2023/11/23 06:15:19 - build_S3_file_location - Step has nrcopies=1
2023/11/23 06:15:19 - Select AWS S3 file location.0 - distribution de-activated
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:19 - Select AWS S3 file location.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Select AWS S3 file location.0 - input rel is 1:1
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Found input rowset [Build AWS S3 file location and Redshift Load Query.0 - Select AWS S3 file location.0]
2023/11/23 06:15:19 - Select AWS S3 file location.0 - output rel. is 1:1
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Found output rowset [Select AWS S3 file location.0 - Set AWS S3 file location.0]
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Finished dispatching
2023/11/23 06:15:19 - build_S3_file_location - Transformation has allocated a new step: [Select AWS S3 file location].0
2023/11/23 06:15:19 - build_S3_file_location - Transformation is about to allocate step [Set AWS S3 file location] of type [SetVariable]
2023/11/23 06:15:19 - build_S3_file_location - Step has nrcopies=1
2023/11/23 06:15:19 - Set AWS S3 file location.0 - distribution activated
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:19 - Set AWS S3 file location.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:19 - Set AWS S3 file location.0 - input rel is 1:1
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Found input rowset [Select AWS S3 file location.0 - Set AWS S3 file location.0]
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Finished dispatching
2023/11/23 06:15:19 - build_S3_file_location - Transformation has allocated a new step: [Set AWS S3 file location].0
2023/11/23 06:15:19 - build_S3_file_location - Transformation is about to allocate step [Generate Single Row] of type [RowGenerator]
2023/11/23 06:15:19 - build_S3_file_location - Step has nrcopies=1
2023/11/23 06:15:19 - Generate Single Row.0 - distribution de-activated
2023/11/23 06:15:19 - Generate Single Row.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:19 - Generate Single Row.0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:19 - Generate Single Row.0 - output rel. is 1:1
2023/11/23 06:15:19 - Generate Single Row.0 - Found output rowset [Generate Single Row.0 - Build AWS S3 file location and Redshift Load Query.0]
2023/11/23 06:15:19 - Generate Single Row.0 - Finished dispatching
2023/11/23 06:15:19 - build_S3_file_location - Transformation has allocated a new step: [Generate Single Row].0
2023/11/23 06:15:19 - build_S3_file_location - This transformation can be replayed with replay date: 2023/11/23 06:15:19
2023/11/23 06:15:19 - build_S3_file_location - Initialising 4 steps...
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Released server socket on port 0
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Released server socket on port 0
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Released server socket on port 0
2023/11/23 06:15:19 - Generate Single Row.0 - Released server socket on port 0
2023/11/23 06:15:19 - build_S3_file_location - Step [Build AWS S3 file location and Redshift Load Query.0] initialized flawlessly.
2023/11/23 06:15:19 - build_S3_file_location - Step [Select AWS S3 file location.0] initialized flawlessly.
2023/11/23 06:15:19 - build_S3_file_location - Step [Set AWS S3 file location.0] initialized flawlessly.
2023/11/23 06:15:19 - build_S3_file_location - Step [Generate Single Row.0] initialized flawlessly.
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Starting to run...
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Starting to run...
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Starting to run...
2023/11/23 06:15:19 - Generate Single Row.0 - Starting to run...
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - This script is using 0 values from the input stream(s)
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Optimization level set to 9.
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - No starting Script found!
2023/11/23 06:15:19 - build_S3_file_location - Transformation has allocated 4 threads and 3 rowsets.
2023/11/23 06:15:19 - Generate Single Row.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Generate Single Row.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - No tran_Status found. Transformation status checking not available.
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - No end Script found!
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Build AWS S3 file location and Redshift Load Query.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Signaling 'output done' to 1 output rowsets.
2023/11/23 06:15:19 - Select AWS S3 file location.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Setting environment variables...
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Set variable S3_STORAGE_FILE_LOCATION to value [s3://s3/boku-dw-ireland/prod/tables/TDS/billing_cdr_historical/2014/01/20140101/billing_cdr_historical_20140101000000_20140101001000]
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Finished after 1 rows.
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Signaling 'output done' to 0 output rowsets.
2023/11/23 06:15:19 - Set AWS S3 file location.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
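`S3_STORAGE_FILE_LOCATION` is assembled by the `Build AWS S3 file location and Redshift Load Query` JavaScript step, whose script the log does not show; the one logged value implies a date-partitioned key layout. A reconstruction of just that path template — the function name and arguments below are illustrative, and the values reproduce the logged example:

```python
from datetime import datetime

def build_s3_file_location(bucket, env, source, table, range_start, range_end):
    """Key layout inferred from the logged S3_STORAGE_FILE_LOCATION:
    <bucket>/<env>/tables/<source>/<table>/<YYYY>/<MM>/<YYYYMMDD>/<table>_<start>_<end>"""
    start = datetime.strptime(range_start, "%Y-%m-%d %H:%M:%S")
    end = datetime.strptime(range_end, "%Y-%m-%d %H:%M:%S")
    return (
        f"s3://{bucket}/{env}/tables/{source}/{table}/"
        f"{start:%Y}/{start:%m}/{start:%Y%m%d}/"
        f"{table}_{start:%Y%m%d%H%M%S}_{end:%Y%m%d%H%M%S}"
    )

print(build_s3_file_location(
    "s3/boku-dw-ireland", "prod", "TDS", "billing_cdr_historical",
    "2014-01-01 00:00:00", "2014-01-01 00:10:00"))
```

One object per job window, with year/month/day prefixes, presumably so that a day's objects share a prefix and can be re-loaded or expired together.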
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - nr of steps to run : 10 , nr of hops : 8
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Dispatching started for transformation [sync_time_based_table_S3_bundlinguseraccount]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Nr of arguments detected:0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - This is not a replay transformation
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - I found 10 different steps to launch.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets...
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 0 --> Get processed rows by Insert / Update
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Get processed rows by Insert / Update.0 - Get JOB_ID.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 1 rowsets for step 0 --> Get processed rows by Insert / Update
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 1 --> Get JOB_ID
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Get JOB_ID.0 - Update job with processedRows.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 2 rowsets for step 1 --> Get JOB_ID
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 2 --> Log Params
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Log Params.0 - AWS S3 File Output.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 3 rowsets for step 2 --> Log Params
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 3 --> AWS S3 File Output
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 3 rowsets for step 3 --> AWS S3 File Output
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 4 --> Select processed record count
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Select processed record count.0 - Copy rows to result.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 4 rowsets for step 4 --> Select processed record count
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 5 --> Copy rows to result
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 4 rowsets for step 5 --> Copy rows to result
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 6 --> Read data to synchronize
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Read data to synchronize.0 - Mask MSISDN.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 5 rowsets for step 6 --> Read data to synchronize
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 7 --> Mask MSISDN
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Mask MSISDN.0 - Select values.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 6 rowsets for step 7 --> Mask MSISDN
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 8 --> Select values
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Select values.0 - Log Params.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 7 rowsets for step 8 --> Select values
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating rowsets for step 9 --> Update job with processedRows
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - prevcopies = 1, nextcopies=1
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation allocated new rowset [Update job with processedRows.0 - Select processed record count.0]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocated 8 rowsets for step 9 --> Update job with processedRows
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Allocating Steps & StepData...
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Get processed rows by Insert / Update] of type [StepsMetrics]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - distribution de-activated
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - output rel. is 1:1
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - Found output rowset [Get processed rows by Insert / Update.0 - Get JOB_ID.0]
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Get processed rows by Insert / Update].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Get JOB_ID] of type [GetVariable]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Get JOB_ID.0 - distribution activated
2023/11/23 06:15:20 - Get JOB_ID.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Get JOB_ID.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Get JOB_ID.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Get JOB_ID.0 - input rel is 1:1
2023/11/23 06:15:20 - Get JOB_ID.0 - Found input rowset [Get processed rows by Insert / Update.0 - Get JOB_ID.0]
2023/11/23 06:15:20 - Get JOB_ID.0 - output rel. is 1:1
2023/11/23 06:15:20 - Get JOB_ID.0 - Found output rowset [Get JOB_ID.0 - Update job with processedRows.0]
2023/11/23 06:15:20 - Get JOB_ID.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Get JOB_ID].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Log Params] of type [WriteToLog]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Log Params.0 - distribution activated
2023/11/23 06:15:20 - Log Params.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Log Params.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Log Params.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Log Params.0 - input rel is 1:1
2023/11/23 06:15:20 - Log Params.0 - Found input rowset [Select values.0 - Log Params.0]
2023/11/23 06:15:20 - Log Params.0 - output rel. is 1:1
2023/11/23 06:15:20 - Log Params.0 - Found output rowset [Log Params.0 - AWS S3 File Output.0]
2023/11/23 06:15:20 - Log Params.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Log Params].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [AWS S3 File Output] of type [S3FileOutputPlugin]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - AWS S3 File Output.0 - distribution activated
2023/11/23 06:15:20 - AWS S3 File Output.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - AWS S3 File Output.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:20 - AWS S3 File Output.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - AWS S3 File Output.0 - input rel is 1:1
2023/11/23 06:15:20 - AWS S3 File Output.0 - Found input rowset [Log Params.0 - AWS S3 File Output.0]
2023/11/23 06:15:20 - AWS S3 File Output.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [AWS S3 File Output].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Select processed record count] of type [SelectValues]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Select processed record count.0 - distribution activated
2023/11/23 06:15:20 - Select processed record count.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Select processed record count.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Select processed record count.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Select processed record count.0 - input rel is 1:1
2023/11/23 06:15:20 - Select processed record count.0 - Found input rowset [Update job with processedRows.0 - Select processed record count.0]
2023/11/23 06:15:20 - Select processed record count.0 - output rel. is 1:1
2023/11/23 06:15:20 - Select processed record count.0 - Found output rowset [Select processed record count.0 - Copy rows to result.0]
2023/11/23 06:15:20 - Select processed record count.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Select processed record count].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Copy rows to result] of type [RowsToResult]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Copy rows to result.0 - distribution activated
2023/11/23 06:15:20 - Copy rows to result.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Copy rows to result.0 - Step info: nrinput=1 nroutput=0
2023/11/23 06:15:20 - Copy rows to result.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Copy rows to result.0 - input rel is 1:1
2023/11/23 06:15:20 - Copy rows to result.0 - Found input rowset [Select processed record count.0 - Copy rows to result.0]
2023/11/23 06:15:20 - Copy rows to result.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Copy rows to result].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Read data to synchronize] of type [TableInput]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Read data to synchronize.0 - distribution de-activated
2023/11/23 06:15:20 - Read data to synchronize.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Read data to synchronize.0 - Step info: nrinput=0 nroutput=1
2023/11/23 06:15:20 - Read data to synchronize.0 - output rel. is 1:1
2023/11/23 06:15:20 - Read data to synchronize.0 - Found output rowset [Read data to synchronize.0 - Mask MSISDN.0]
2023/11/23 06:15:20 - Read data to synchronize.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Read data to synchronize].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Mask MSISDN] of type [UserDefinedJavaClass]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Mask MSISDN.0 - distribution activated
2023/11/23 06:15:20 - Mask MSISDN.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Mask MSISDN.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Mask MSISDN.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Mask MSISDN.0 - input rel is 1:1
2023/11/23 06:15:20 - Mask MSISDN.0 - Found input rowset [Read data to synchronize.0 - Mask MSISDN.0]
2023/11/23 06:15:20 - Mask MSISDN.0 - output rel. is 1:1
2023/11/23 06:15:20 - Mask MSISDN.0 - Found output rowset [Mask MSISDN.0 - Select values.0]
2023/11/23 06:15:20 - Mask MSISDN.0 - Finished dispatching
2023/11/23 06:15:20 - Mask MSISDN.0 - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : Error initializing UserDefinedJavaClass:
2023/11/23 06:15:20 - Mask MSISDN.0 - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : org.pentaho.di.core.exception.KettleException:
2023/11/23 06:15:20 - Mask MSISDN.0 - null
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Mask MSISDN].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Select values] of type [SelectValues]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Select values.0 - distribution activated
2023/11/23 06:15:20 - Select values.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Select values.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Select values.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Select values.0 - input rel is 1:1
2023/11/23 06:15:20 - Select values.0 - Found input rowset [Mask MSISDN.0 - Select values.0]
2023/11/23 06:15:20 - Select values.0 - output rel. is 1:1
2023/11/23 06:15:20 - Select values.0 - Found output rowset [Select values.0 - Log Params.0]
2023/11/23 06:15:20 - Select values.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Select values].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation is about to allocate step [Update job with processedRows] of type [Update]
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step has nrcopies=1
2023/11/23 06:15:20 - Update job with processedRows.0 - distribution activated
2023/11/23 06:15:20 - Update job with processedRows.0 - Starting allocation of buffers & new threads...
2023/11/23 06:15:20 - Update job with processedRows.0 - Step info: nrinput=1 nroutput=1
2023/11/23 06:15:20 - Update job with processedRows.0 - !BaseStep.Log.GotPreviousStep!
2023/11/23 06:15:20 - Update job with processedRows.0 - input rel is 1:1
2023/11/23 06:15:20 - Update job with processedRows.0 - Found input rowset [Get JOB_ID.0 - Update job with processedRows.0]
2023/11/23 06:15:20 - Update job with processedRows.0 - output rel. is 1:1
2023/11/23 06:15:20 - Update job with processedRows.0 - Found output rowset [Update job with processedRows.0 - Select processed record count.0]
2023/11/23 06:15:20 - Update job with processedRows.0 - Finished dispatching
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Transformation has allocated a new step: [Update job with processedRows].0
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - This transformation can be replayed with replay date: 2023/11/23 06:15:20
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Initialising 10 steps...
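The Mask MSISDN.0 ERROR above is the root cause of the failure recorded later in this run: the [Mask MSISDN] step of type [UserDefinedJavaClass] throws a KettleException with a null message while its embedded Java class is being set up, before any rows flow. A null message at this point commonly indicates the embedded source could not be compiled or loaded in this environment, though the log does not show the underlying cause. The step's real source is not included in the log; purely as an illustration of the kind of logic such a step typically holds, a self-contained masking sketch (the class name, field handling, and prefix length are assumptions, not the job's code) could look like:

    // Illustrative only: the real [Mask MSISDN] step code is not shown in the log.
    public class MsisdnMaskSketch {
        // Keep a fixed-length prefix (e.g. country code + NDC) and
        // replace the remaining subscriber digits with '*'.
        static String mask(String msisdn, int keep) {
            if (msisdn == null || msisdn.length() <= keep) {
                return msisdn; // nothing to mask
            }
            StringBuilder sb = new StringBuilder(msisdn.substring(0, keep));
            for (int i = keep; i < msisdn.length(); i++) {
                sb.append('*');
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(mask("358401234567", 5)); // -> 35840*******
        }
    }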
2023/11/23 06:15:20 - Get processed rows by Insert / Update.0 - Released server socket on port 0
2023/11/23 06:15:20 - Get JOB_ID.0 - Released server socket on port 0
2023/11/23 06:15:20 - Log Params.0 - Released server socket on port 0
2023/11/23 06:15:20 - Select processed record count.0 - Released server socket on port 0
2023/11/23 06:15:20 - Copy rows to result.0 - Released server socket on port 0
2023/11/23 06:15:20 - Read data to synchronize.0 - Released server socket on port 0
2023/11/23 06:15:20 - Select values.0 - Released server socket on port 0
2023/11/23 06:15:20 - Mask MSISDN.0 - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : Error initializing step [Mask MSISDN]
2023/11/23 06:15:20 - Update job with processedRows.0 - Released server socket on port 0
2023/11/23 06:15:20 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:20 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:20 - AWS S3 File Output.0 - Released server socket on port 0
2023/11/23 06:15:20 - Read data to synchronize.0 - Connected to database...
2023/11/23 06:15:20 - Update job with processedRows.0 - Connected to database...
2023/11/23 06:15:20 - TDS_CONNECTION - Auto commit off
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Get processed rows by Insert / Update.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Get JOB_ID.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Log Params.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [AWS S3 File Output.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Select processed record count.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Copy rows to result.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Read data to synchronize.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : Step [Mask MSISDN.0] failed to initialize!
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Select values.0] initialized flawlessly.
2023/11/23 06:15:20 - sync_time_based_table_S3_bundlinguseraccount - Step [Update job with processedRows.0] initialized flawlessly.
2023/11/23 06:15:20 - Read data to synchronize.0 - Finished reading query, closing connection
2023/11/23 06:15:20 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:20 - TDS_CONNECTION - Commit on database connection [TDS_CONNECTION]
2023/11/23 06:15:20 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : Unable to prepare for execution of the transformation
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - ERROR (version 9.4.0.0-343, build 0.0 from 2022-11-08 07.50.27 by buildguy) : org.pentaho.di.core.exception.KettleException:
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - We failed to initialize at least one step. Execution can not begin!
2023/11/23 06:15:20 - Process S3_bundlinguseraccount -
2023/11/23 06:15:20 - Process S3_bundlinguseraccount -
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1301)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.trans.Trans.execute(Trans.java:763)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:1189)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.Job.execute(Job.java:707)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.Job.execute(Job.java:848)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.Job.execute(Job.java:848)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.Job.execute(Job.java:848)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.Job.execute(Job.java:592)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:69)
2023/11/23 06:15:20 - Process S3_bundlinguseraccount - at java.lang.Thread.run(Thread.java:748)
2023/11/23 06:15:20 - sync_time_based_table_TDS_S3_bundlinguseraccount - Finished job entry [Process S3_bundlinguseraccount] (result=[false])
2023/11/23 06:15:20 - sync_time_based_table_TDS_S3_bundlinguseraccount - Finished job entry [Build S3 file location] (result=[false])
2023/11/23 06:15:20 - sync_time_based_table_TDS_S3_bundlinguseraccount - Finished job entry [Simple evaluation] (result=[false])
2023/11/23 06:15:20 - Synchronize table - EmbeddedMetastore objects have been disposed.
2023/11/23 06:15:20 - sync_one_table_time_based - Starting entry [Update Job: ERROR]
2023/11/23 06:15:20 - sync_one_table_time_based - exec(14, 0, Update Job: ERROR.0)
2023/11/23 06:15:20 - Update Job: ERROR - Starting job entry
2023/11/23 06:15:20 - TDS_CONNECTION - New database connection defined
2023/11/23 06:15:20 - Update Job: ERROR - Running SQL :UPDATE etl_job_schedule SET state = 'ERROR' where id = 144042561;
2023/11/23 06:15:20 - TDS_CONNECTION - launch DDL statement:
2023/11/23 06:15:20 - TDS_CONNECTION - UPDATE etl_job_schedule SET state = 'ERROR' where id = 144042561
2023/11/23 06:15:20 - TDS_CONNECTION - 1 statement executed
2023/11/23 06:15:20 - TDS_CONNECTION - Connection to database closed!
2023/11/23 06:15:20 - sync_one_table_time_based - Starting entry [Continue Next Table on Error]
2023/11/23 06:15:20 - sync_one_table_time_based - exec(15, 0, Continue Next Table on Error.0)
2023/11/23 06:15:20 - Continue Next Table on Error - Starting job entry
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Continue Next Table on Error] (result=[true])
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Update Job: ERROR] (result=[true])
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Synchronize table] (result=[true])
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Check Timebase Facts File Exists] (result=[true])
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Update Job: EXECUTING] (result=[true])
2023/11/23 06:15:20 - sync_one_table_time_based - Finished job entry [Write synchronizing table to log] (result=[true])
2023/11/23 06:15:20 - Sync Time Based Tables - EmbeddedMetastore objects have been disposed.
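The stack trace above is the standard PDI failure path when Trans.prepareExecution cannot initialize every step: the parent job entry reports "Unable to prepare for execution of the transformation", the [Update Job: ERROR] entry flips the schedule row to ERROR (the UPDATE on etl_job_schedule just above), and [Continue Next Table on Error] lets the run move on. To reproduce the same exception outside Kitchen, a minimal embedding sketch along these lines should surface it; the .ktr path is a placeholder, and this assumes the pentaho-kettle core jars plus the job's step plugins are on the classpath.

    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.core.exception.KettleException;
    import org.pentaho.di.trans.Trans;
    import org.pentaho.di.trans.TransMeta;

    public class RunKtrSketch {
        public static void main(String[] args) throws Exception {
            KettleEnvironment.init(); // load step plugins, as Kitchen does at startup

            // Placeholder path: point this at the failing .ktr from the log.
            TransMeta meta = new TransMeta("sync_time_based_table_S3_bundlinguseraccount.ktr");
            Trans trans = new Trans(meta);
            try {
                // execute() calls prepareExecution() internally; a step that fails
                // to initialize (here: Mask MSISDN) surfaces as a KettleException.
                trans.execute(new String[0]);
                trans.waitUntilFinished();
                System.out.println("errors: " + trans.getErrors());
            } catch (KettleException e) {
                // "We failed to initialize at least one step. Execution can not begin!"
                e.printStackTrace();
            }
        }
    }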
2023/11/23 06:15:20 - start_sync_time_based_tables - Starting entry [Success]
2023/11/23 06:15:20 - start_sync_time_based_tables - exec(9, 0, Success.0)
2023/11/23 06:15:20 - Success - Starting job entry
2023/11/23 06:15:20 - start_sync_time_based_tables - Finished job entry [Success] (result=[true])
2023/11/23 06:15:20 - start_sync_time_based_tables - Finished job entry [Sync Time Based Tables] (result=[true])
2023/11/23 06:15:20 - start_sync_time_based_tables - Finished job entry [Get Sync Jobs] (result=[true])
2023/11/23 06:15:20 - start_sync_time_based_tables - Finished job entry [Get heartbeat] (result=[true])
2023/11/23 06:15:20 - start_sync_time_based_tables - Finished job entry [Time based job pickup delay] (result=[true])
2023/11/23 06:15:20 - Sync Time based Tables - EmbeddedMetastore objects have been disposed.
2023/11/23 06:15:20 - start_all_facts - Starting entry [Delete pid file]
2023/11/23 06:15:20 - start_all_facts - exec(4, 0, Delete pid file.0)
2023/11/23 06:15:20 - Delete pid file - Starting job entry
2023/11/23 06:15:20 - Delete pid file - File [file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync_facts_TDS_S3_SCHEDULED.pid] deleted!
2023/11/23 06:15:20 - start_all_facts - Starting entry [Success]
2023/11/23 06:15:20 - start_all_facts - exec(5, 0, Success.0)
2023/11/23 06:15:20 - Success - Starting job entry
2023/11/23 06:15:20 - start_all_facts - Finished job entry [Success] (result=[true])
2023/11/23 06:15:20 - start_all_facts - Finished job entry [Delete pid file] (result=[true])
2023/11/23 06:15:20 - start_all_facts - Finished job entry [Sync Time based Tables] (result=[true])
2023/11/23 06:15:20 - start_all_facts - Finished job entry [Safe to continue?] (result=[true])
2023/11/23 06:15:20 - start_all_facts - Finished job entry [Process Destination System and Job Type Filter] (result=[true])
2023/11/23 06:15:20 - start_all_facts - Job execution finished
2023/11/23 06:15:20 - start_all_facts - MetaFileCache Loading Summary
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/TDS_S3_bundlinguseraccount/sync_time_based_table_S3_bundlinguseraccount.ktr was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_parameters.ktr was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync_one_table_time_based.kjb was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/sync_time_based_table_TDS_S3_bundlinguseraccount.kjb was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_sync_jobs.ktr was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/sync/time_based/utils/build_S3_file_location.ktr was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/start_sync_time_based_tables.kjb was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - FILENAME:file:///opt/bi/etl/pentaho/pentaho-etl-1.160.0/etl/pentaho/kettle/get_TDS_heartbeat_from_table.ktr was loaded 0 times from the cache.
2023/11/23 06:15:20 - start_all_facts - EmbeddedMetastore objects have been disposed.
2023/11/23 06:15:20 - Kitchen - Finished!
2023/11/23 06:15:20 - Kitchen - Start=2023/11/23 06:15:14.542, Stop=2023/11/23 06:15:20.541
2023/11/23 06:15:20 - Kitchen - Processing ended after 5 seconds.