So, big picture: what you are doing now is processing, say, 1 billion rows, loading them all into memory (your issue), and passing them in one go to the next job/transformation, and so on. You may or may not have enough memory for that.
What I am suggesting is that you process the billion rows and output them to table A.
The next job reads table A, massages the data, and outputs to table B, and so forth (let's call this process X). Each step writes its results to disk instead of accumulating them in memory.
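Kettle jobs are built graphically rather than in code, but the staging idea is easy to sketch. Here is a minimal Python illustration of the pattern (the table names, column layout, and chunk size are assumptions for the example, not anything from your actual job):

```python
import sqlite3

CHUNK = 50_000  # rows fetched per round trip; tune to your memory budget

def stage(conn, src_sql, dest_table, transform):
    """Stream rows from src_sql into dest_table, one chunk at a time."""
    read = conn.cursor()
    read.execute(src_sql)
    while True:
        rows = read.fetchmany(CHUNK)        # only CHUNK rows in memory at once
        if not rows:
            break
        conn.executemany(f"INSERT INTO {dest_table} VALUES (?, ?)",
                         [transform(r) for r in rows])
    conn.commit()

conn = sqlite3.connect(":memory:")          # stand-in for your real warehouse
conn.execute("CREATE TABLE source  (id INTEGER, val REAL)")
conn.execute("CREATE TABLE table_a (id INTEGER, val REAL)")
conn.execute("CREATE TABLE table_b (id INTEGER, val REAL)")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(i, float(i)) for i in range(1_000)])

# Process X: source -> table A -> table B, each step reading the previous table
stage(conn, "SELECT id, val FROM source",  "table_a", lambda r: (r[0], r[1] * 2))
stage(conn, "SELECT id, val FROM table_a", "table_b", lambda r: (r[0], r[1] + 1))
```

The point is that fetchmany keeps only one chunk in memory at a time, and each stage can be restarted from its table if something fails, which is exactly what chaining jobs through tables buys you.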
Further to this, if you need a loop that runs process X for values 1, 2, 3 today (4, 5, 6 tomorrow, and so on), you would:
1. generate a transformation that determines the values for today (i.e. 1, 2, 3)
2. pass the values from step 1, one row at a time, to sub-job/transformation Y
Use Copy rows to result sparingly: the copied rows are held in memory, so reserve it for small row sets like the three parameter values in step 1, never the bulk data itself.
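Conceptually, the driver loop in steps 1 and 2 looks like the sketch below (values_for_today and process_y are hypothetical stand-ins for your value-generating transformation and sub-job Y; in Kettle you would wire this up with Copy rows to result feeding a job entry that executes once per input row):

```python
from datetime import date

def values_for_today(today: date) -> list[int]:
    """Hypothetical stand-in for the step 1 transformation: three fresh values per day."""
    offset = (today - date(2024, 1, 1)).days * 3   # assumed start date, illustration only
    return [offset + 1, offset + 2, offset + 3]

def process_y(value: int) -> None:
    """Stand-in for sub-job/transformation Y; receives exactly one value per call."""
    print(f"running process X for value {value}")

# Driver job: step 1 yields a small row set, step 2 runs Y once per row
for value in values_for_today(date.today()):
    process_y(value)
```

Note that the only thing copied to the result here is a handful of parameter values, which is why this particular use of Copy rows to result is cheap.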