
Hadoop Copy Files is not working properly when triggering the job using Kitchen.bat/kitchen.sh

Question asked by Shankar Panda on Apr 26, 2018

Hi All,

 

I need urgent help. My project runs entirely in a big data environment and deals with Hive tables, so I have used the Hadoop Copy Files step to place files in HDFS.

 

When I trigger the job directly from Spoon, it works fine: the file is successfully copied to HDFS.

 

But when I run the same job using Kitchen.bat, the Hadoop Copy Files step tries to copy the files to {PENTAHO_UNZIP_DIR}/{HDFS Path}. I have set the destination environment to the connection I created for the Hadoop cluster. In short, when the job runs through Kitchen.bat, the Pentaho installation directory gets prepended to the HDFS destination path.
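
For reference, this is roughly how the job is being launched (a sketch only: the .kjb path and the parameter values are placeholders, but the parameter names and the PDI installation directory match what appears in the log below):

cd /var/app/pdi-ce-6.0.1.0-386
./kitchen.sh -file=/path/to/copy_to_hdfs.kjb \
    -param:srcFilePath=/data/incoming \
    -param:srcFileName=OUM \
    -level=Basic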

 

Here is the log:

 

2018/04/26 06:44:18 - Hadoop Copy Files - Processing row source File/folder source : [${srcFilePath}/] ... destination file/folder : [ /tmp/OUMFiles]... wildcard : [${srcFileName}.*\.csv]
2018/04/26 06:44:18 - Hadoop Copy Files - Folder  file:///var/app/pdi-ce-6.0.1.0-386/hdfs:/hdfs:{password}@10.1.7.9:8020/tmp/OUMFiles does not exist !
2018/04/26 06:44:18 - Hadoop Copy Files - Folder parent was created.
2018/04/26 06:44:18 - Hadoop Copy Files -
2018/04/26 06:44:18 - Hadoop Copy Files - Fetching : [file:///tmp/OUMFiles]
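
Reading the log, the destination appears to be configured as a full HDFS VFS URL but gets resolved as a path relative to the PDI installation directory when run through Kitchen (a reconstruction from the log above; the password is masked exactly as in the log):

Configured destination: hdfs://hdfs:{password}@10.1.7.9:8020/tmp/OUMFiles
Resolved by Kitchen:    file:///var/app/pdi-ce-6.0.1.0-386/hdfs:/hdfs:{password}@10.1.7.9:8020/tmp/OUMFiles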

 

Could you please help with this?
