
Recent Activity

new niu
When I call a transformation with the pan or kitchen command, the Hadoop File Output step always writes to the local disk. I am using "pdi-ce-8.2.0.0-342\data-integration" and the plugin path "\pdi-ce-8.2.0.0-342\data-integration\plugins\pentaho-big-data-plugin\hadoop-configurations\cdh514". My KTR file uses 2 steps: first generate a random number, then save it to…
in Big Data
vanna an
I am new to Pentaho BA and want to connect a datasource to my Apache Spark cluster. Which driver do I need to put in tomcat/lib? (A hedged connection sketch follows this item.)
in Big Data
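
Assuming the cluster exposes the Spark Thrift Server, which speaks the HiveServer2 protocol, the Hive JDBC driver (hive-jdbc plus its dependent jars) is typically what goes into tomcat/lib. Below is a minimal stand-alone sketch of such a connection; the host name, port, database, and credentials are placeholders, not values from the original post.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SparkThriftSmokeTest {
    public static void main(String[] args) throws Exception {
        // The Spark Thrift Server speaks the HiveServer2 protocol,
        // so the standard Hive JDBC driver is used here.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Host, port (10000 is a common default), database, and credentials
        // are placeholders -- replace them with the values of your own cluster.
        String url = "jdbc:hive2://spark-thrift-host:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println("Connected, got: " + rs.getInt(1));
            }
        }
    }
}

If this connects with a given set of jars on the classpath, those are the jars the BA server will also need in tomcat/lib for the same JDBC URL.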
vanna an
I have created a new datasource that connects via the Hive Thrift JDBC interface. It connects successfully but does not show any tables. (A table-listing sketch follows this item.)
in Big Data
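
One way to check whether the tables are visible to the JDBC driver at all, independently of the BA server, is to list them straight through the driver's metadata API. A minimal sketch, assuming the Hive JDBC driver and a placeholder hive2 URL, database, and credentials:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class HiveTableListing {
    public static void main(String[] args) throws Exception {
        // Standard Hive JDBC driver, also used for HiveServer2 Thrift endpoints.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // URL, database, and credentials are placeholders -- use the same values
        // as the datasource configured in the BA server.
        String url = "jdbc:hive2://hive-host:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // List tables through JDBC metadata; the null type filter avoids
            // missing tables the driver reports as MANAGED_TABLE or EXTERNAL_TABLE.
            DatabaseMetaData meta = conn.getMetaData();
            try (ResultSet rs = meta.getTables(null, "default", "%", null)) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}

If this lists the expected tables but the BA server still shows none, the driver and connection are probably fine, and the schema or datasource configuration on the server side is the more likely place to look.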
pandit gurav
Hi all, I am running a Spark job in a cluster through PDI using the AEL daemon, and I am getting the error below:
2018/12/27 21:09:32 - Carte - Installing timer to purge stale objects after 1440 minutes.
2018/12/27 21:10:10 - Spoon - Transformation opened.
2018/12/27 21:10:10 - Spoon - Launching transformation [spark-submit]...
2018/12/27 21:10:10 - Spoon -…
in Big Data