
Recent Activity

B167W1XV
I'm working on a transformation which selects from an Oracle database, inserts into a mapping table, performs a couple of lookups and mappings, then inserts into a PostgreSQL database. When I tested with a handful of mocked-up records, it worked as expected. Now I'm running against a test data set. When I preview the Oracle table input…
in Data Integration
Ken Wood
Easy to Use Deep Learning and Neural Networks with Pentaho, by Ken Wood and Mark Hall. Hitachi Vantara Labs is excited to release a new version of the experimental plugin, Plugin Machine Intelligence version 1.4. Along with several minor fixes and enhancements is the addition of a new execution engine for performing deep learning and…
Daniel Lee
I have an HTTP step (in a Job) that connects to a web server to download a file via basic authentication, which works great on Mac OS X. However, it does not work on Windows, with the following error: ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : I was unable to save the HTTP result to file because of an I/O error: Server…
in Data Integration
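One way to rule out platform-specific credential handling in a case like the post above is to set the `Authorization` header explicitly and test the download outside PDI. A minimal sketch using only the Python standard library; the URL, credentials, and file name are placeholders, not taken from the post:

```python
# Hypothetical sketch: download a file over HTTP with basic auth,
# building the Authorization header by hand so the behaviour is
# identical on every OS. All names below are illustrative.
import base64
import urllib.request


def build_request(url: str, user: str, password: str) -> urllib.request.Request:
    """Attach a Basic auth header explicitly instead of relying on
    an auth handler, so nothing depends on the platform."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req


def download(url: str, user: str, password: str, dest: str) -> None:
    """Stream the authenticated response body to a local file."""
    req = build_request(url, user, password)
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as out:
        out.write(resp.read())
```

If a plain script like this also fails on Windows only, the problem is likely environmental (proxy, firewall, TLS store) rather than the PDI step itself.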
Clifford Grimm
I have a transformation that uses a "streaming" input step. That is, the input step is always listening for incoming messages to process and thus never finishes. Because of some complications with the transformation w.r.t. creation of temporary files, the best solution we have found to help avoid the issues was to use a Single Threader step to call…
in Data Integration
Jose Barrioss Barrios
Good morning, I need the file "oracle.rdb.jdbc.rdbthin.driver" and I cannot find it on the Oracle website. Could you share it with me? Thanks!
in Data Integration
Paul Horan
1. Read a .CSV file from a specified folder location. Let's keep it simple and assume there's only one - myData1.csv.
2. Derive the metadata about the fields in that file: names, datatypes, precision, etc.
3. Generate a CREATE OR REPLACE TABLE myData1 (...) statement that exactly matches the CSV.
4. Use a generic JDBC connection to…
in Data Integration
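Steps 1-3 of the post above can be sketched outside PDI as a small schema-inference routine. This is a hypothetical illustration with a deliberately simple type ladder (INTEGER → DOUBLE PRECISION → VARCHAR); the function names and type mapping are assumptions, not a PDI API:

```python
# Hypothetical sketch: infer column types from a CSV sample and emit
# a CREATE OR REPLACE TABLE statement matching it. The type ladder
# and naming are illustrative assumptions only.
import csv
import io


def infer_type(values):
    """Pick the narrowest SQL type that fits every sampled value."""
    def is_int(v):
        try:
            int(v)
            return True
        except ValueError:
            return False

    def is_float(v):
        try:
            float(v)
            return True
        except ValueError:
            return False

    if all(is_int(v) for v in values):
        return "INTEGER"
    if all(is_float(v) for v in values):
        return "DOUBLE PRECISION"
    return f"VARCHAR({max(len(v) for v in values)})"


def csv_to_ddl(text: str, table: str) -> str:
    """Build the DDL from a CSV string: header row gives column
    names, the data rows drive type inference."""
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    cols = [f"{name} {infer_type([r[i] for r in data])}"
            for i, name in enumerate(header)]
    return (f"CREATE OR REPLACE TABLE {table} (\n  "
            + ",\n  ".join(cols) + "\n)")
```

A real pipeline would also need to handle NULLs, quoting of identifiers, and precision for decimals, but the shape of the problem is the same.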
Callian Dutra
Hi, I have a transformation that consumes a web service through HTTP POST. The response is a compressed binary file that I need to decompress to get the real result (an XML file). My difficulty is in decompressing this result. Does anyone know how to do it, and what's the best approach?
in Data Integration
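If the payload from the post above turns out to be gzip- or deflate-compressed, it can be unpacked with the Python standard library. This is a sketch under that assumption, since the post does not name the compression format:

```python
# Hypothetical sketch: recover an XML body from a compressed HTTP
# response. Tries gzip first (identified by its magic number), then
# zlib-wrapped deflate, then raw deflate.
import gzip
import zlib


def decompress_payload(data: bytes) -> bytes:
    """Return the decompressed bytes of a gzip/deflate payload."""
    if data[:2] == b"\x1f\x8b":            # gzip magic number
        return gzip.decompress(data)
    try:
        return zlib.decompress(data)        # zlib-wrapped deflate
    except zlib.error:
        return zlib.decompress(data, -15)   # raw deflate, no header
```

Inside PDI the same effect can often be had with a User Defined Java Class or a script step, but checking the first bytes of the response is the quickest way to confirm which format the service actually returns.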
Neeraj Arora
CentOS 7 64-bit, Java 8, Pentaho 8.0. We have jobs scheduled through Control-M and are facing the error below intermittently for different jobs; the same job completes successfully after a rerun.
DEBUG: _PENTAHO_JAVA_HOME=/usr/java/latest
DEBUG: _PENTAHO_JAVA=/usr/java/latest/bin/java…
in Data Integration
Giuseppe La Rosa
Hi everyone, I would like to ask the help of the Pentaho community in solving an issue I have with Pentaho. I'm using Pentaho 7.1 Community Edition. The problem is that Pentaho very often fills all the available memory and gets stuck when running jobs that contain sub-jobs. To better clarify what I…
in Data Integration
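A common first step for the symptom in the post above is to raise the JVM heap available to PDI. The launch scripts (spoon.sh, kitchen.sh, pan.sh) read extra JVM flags from the `PENTAHO_DI_JAVA_OPTIONS` environment variable; the sizes below are illustrative guesses, not tuned values:

```shell
# PDI launch scripts pick up extra JVM flags from this variable.
# -Xms/-Xmx sizes here are examples only; tune to the host's RAM.
export PENTAHO_DI_JAVA_OPTIONS="-Xms1g -Xmx4g"
# then launch the parent job as usual, e.g.:
# ./kitchen.sh -file=/path/to/parent_job.kjb
```

If the heap still fills up, the next thing to check is whether the nested jobs accumulate result rows or files between iterations, since those are held in memory for the life of the parent job.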