
I have a problem while submitting my Spark job in YARN mode; I am getting the exception below.

Question asked by pandit gurav on Dec 28, 2018


Hi all,

I am running a Spark job on a cluster through PDI using the AEL daemon, and I am getting the error below:

2018/12/27 21:09:32 - Carte - Installing timer to purge stale objects after 1440 minutes.

2018/12/27 21:10:10 - Spoon - Transformation opened.

2018/12/27 21:10:10 - Spoon - Launching transformation [spark-submit]...

2018/12/27 21:10:10 - Spoon - Started the transformation execution.

2018/12/27 21:11:29 - Hadoop File Input.0 - Hadoop File Input initialized successfully

2018/12/27 21:11:29 - Hadoop File Output.0 - Hadoop File Output initialized successfully

2018/12/27 21:11:40 - Hadoop File Input.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 6, ip-10-222-113-51.eu-west-1.compute.internal, executor 1): java.lang.IllegalStateException: org.pentaho.di.core.exception.KettleException:

Unable to read file '/home/yarn/.kettle/kettle.properties'

/home/yarn/.kettle/kettle.properties (No such file or directory)
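The executor fails because `/home/yarn/.kettle/kettle.properties` does not exist on the worker node where the task runs. A minimal sketch of a workaround, assuming you have shell access to each YARN worker node and that the executors run as the `yarn` user (both taken from the paths in the error above), is to make sure an (even empty) `kettle.properties` exists there:

```shell
#!/bin/sh
# ensure_kettle_props: create an empty kettle.properties under the given
# home directory if one does not already exist. An empty properties file
# is enough to get past the "No such file or directory" failure.
ensure_kettle_props() {
    kettle_dir="$1/.kettle"
    mkdir -p "$kettle_dir"
    [ -f "$kettle_dir/kettle.properties" ] || touch "$kettle_dir/kettle.properties"
}

# On each worker node (path taken from the error message), e.g.:
# ensure_kettle_props /home/yarn
```

This would need to be run on every node that can host an executor (the retries in the log show the task failing 4 times on the same node), or baked into the node provisioning.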
