I have a very simple transformation that uses a Get System Info step to get a start and finish date. It passes those as a row to a Table Input step and then writes the resulting rows out with an AWS S3 output step.
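For reference, the Table Input query binds the two dates with question-mark placeholders, roughly like this sketch (the table and column names here are made up for illustration):

    SELECT id, amount, created_at
    FROM sales                      -- hypothetical table
    WHERE created_at >= ?           -- bound to the first field from Get System Info (start date)
      AND created_at <  ?           -- bound to the second field (finish date)

The Table Input step's "Insert data from step" option points at the Get System Info step, and the ? placeholders are filled in the order the fields appear on the incoming row.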
It runs with no problems locally as a transformation, and I can also launch it remotely using Carte and it works.
If I put it inside a job and run it locally it works (a simple job: Start, Check DB, Transformation, send confirmation email, Success).
If I run the same job remotely it does execute (it sends the success email, so it definitely runs), but it doesn't generate the file for the S3 step.
It seems like the Table Input step is returning zero rows, almost as if the Get System Info step is not passing it the two date stamps.
Any help much appreciated.
Very new to all this, apologies for all the questions.
Figured it out, sharing in case anyone has this issue in the future.
What I ended up doing was switching approaches. Instead of sending the dates from the Get System Info step to the query as a row (using question marks in the query), I broke the transformation into two: the first sets the dates as variables at job level, and the second does the rest of the work, with the Table Input query referencing them in ${variable} notation.
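For anyone trying the same fix, the reworked Table Input query looks something like this sketch (again with made-up table, column, and variable names), with the "Replace variables in script?" box ticked on the Table Input step:

    SELECT id, amount, created_at
    FROM sales                                -- hypothetical table
    WHERE created_at >= '${START_DATE}'       -- set by the first transformation's Set Variables step
      AND created_at <  '${FINISH_DATE}'      -- variable scope set to "Valid in the parent job"

One gotcha worth knowing: variables set with the Set Variables step aren't available within the same transformation, which is why it has to be split into two transformations run in sequence from the job.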
I don't really understand why the original worked everywhere except on the remote server, but this has fixed the problem.