Rajkumar Venkatasamy
This individual is no longer active. Application functionality related to this individual is limited.
List of Contributions
My Content: 1 to 8 of 8 total (search criteria = ALL)
Select current datetime from Database instead of PDI server time using Table Input step
Posted By Rajkumar Venkatasamy, 10-11-2021 12:04
Found In Community: Pentaho
Hi Pentaho Community users, I am using PDI Community Edition version 9.0. The Table Input step has a simple SQL query pointed at a Postgres database connection, as shown below: select now(). I expected that PDI would execute the query in the target database (which is running in Pacific ...
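The question hinges on whose clock `now()` reads: the database server's or the PDI host's. Below is a minimal, stdlib-only Java sketch (not PDI code; the class name and the two zone IDs are illustrative assumptions) showing how one instant renders as different wall-clock times in two zones, which is the kind of discrepancy the post describes.

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class NowComparison {
    // The database's now() uses the DB server's clock and timezone setting,
    // while PDI's system date comes from the PDI host. Rendering the same
    // instant in two zones shows why the two values can disagree.
    public static ZonedDateTime inZone(Instant instant, String zone) {
        return instant.atZone(ZoneId.of(zone));
    }

    public static void main(String[] args) {
        Instant now = Instant.now();
        // Illustrative zones: database host in US/Pacific, PDI server elsewhere
        ZonedDateTime dbView  = inZone(now, "US/Pacific");
        ZonedDateTime pdiView = inZone(now, "Asia/Kolkata");
        System.out.println("DB now() view:  " + dbView);
        System.out.println("PDI host view:  " + pdiView);
        // Same instant underneath, different wall-clock representations
        System.out.println(dbView.toInstant().equals(pdiView.toInstant()));
    }
}
```

Because the Table Input step sends the SQL text to the JDBC connection, `select now()` should indeed evaluate on the database side; the sketch only illustrates why the result can still look "wrong" when the two hosts sit in different zones.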
Sometimes getting the error while executing Pentaho Job. Unable to create lock file: \.kettle\environment\metastore\.lock
Posted By Rajkumar Venkatasamy, 12-16-2020 01:57
Found In Community: Pentaho
Environment details: PDI version 9 (Community Edition), Kettle Environment plugin 1.7.1. Details: At random, the below error occurs while executing a Pentaho job. The error especially occurs in any of the Mapping - Sub transformation steps that exist in the project. The frequency of this ...
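The "Unable to create lock file" error is characteristic of two processes contending for the same exclusive lock file. As a rough, stdlib-only illustration of the pattern (not the plugin's actual code; the class name, method name, and retry parameters are invented for this sketch), here is how an exclusive file lock can be acquired with a few retries instead of failing on first contention:

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.channels.OverlappingFileLockException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class LockRetry {
    // Try to acquire an exclusive lock on a file, retrying a few times
    // before giving up, rather than erroring out on first contention.
    public static boolean lockWithRetry(Path file, int attempts, long waitMillis)
            throws IOException, InterruptedException {
        for (int i = 0; i < attempts; i++) {
            try (FileChannel ch = FileChannel.open(file,
                    StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
                FileLock lock = ch.tryLock(); // null if another process holds it
                if (lock != null) {
                    lock.release();
                    return true;
                }
            } catch (OverlappingFileLockException e) {
                // Lock already held within this JVM; fall through and retry
            }
            Thread.sleep(waitMillis);
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("metastore", ".lock");
        System.out.println(lockWithRetry(tmp, 3, 100));
        Files.deleteIfExists(tmp);
    }
}
```

If the error appears at random across sub-transformations, concurrent jobs or steps initializing the same metastore directory would be a plausible trigger to investigate.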
RE: PDI Kafka Consumer in version 9.0 is resulting into io.reactivex.exceptions.MissingBackpressureException
Posted By Rajkumar Venkatasamy, 05-27-2020 08:07
Found In Community: Pentaho
Hi Sergio, yes, I am using the Java version recommended by Pentaho. If you don't have any experience with the Kafka Consumer part, could you please direct me to whom I could get connected with on the Pentaho Community front, or to the JIRA project URL, for raising an issue on the Pentaho Kafka / Big Data plugin?
PDI Kafka Consumer in version 9.0 is resulting into io.reactivex.exceptions.MissingBackpressureException
Posted By Rajkumar Venkatasamy, 05-08-2020 11:09
Found In Community: Pentaho
Hello Sergio, I am trying to reach you to understand if you could help me with this reported issue in the Pentaho community. More details can be found at the link below: https://community.hitachivantara.com/s/question/0D52S000080jjv3SAA Please let me know if you need more information in this regard. ...
Pentaho Kafka Consumer in version 9.0 is resulting into io.reactivex.exceptions.MissingBackpressureException
Posted By Rajkumar Venkatasamy, 05-08-2020 10:50
Found In Community: Pentaho
I have developed a Pentaho Kafka Consumer transformation (and an associated sub-transformation) that was working fine in PDI Community Edition 8.0. But the same transformations, processing the same set of messages from the same topic, fail after the PDI upgrade to 9.0 with the following exception: ...
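`MissingBackpressureException` comes from RxJava and signals that an upstream source emitted faster than the downstream could consume, with no backpressure strategy in place. The stdlib sketch below (not PDI or RxJava code; the class and method names are invented for illustration) shows the idea backpressure provides: a bounded queue makes the producer block instead of overflowing the consumer.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BackpressureSketch {
    // A bounded queue: the producer blocks on put() whenever the consumer
    // falls behind, so every message is delivered eventually. An unbounded
    // reactive source with no such strategy fails instead, which is the
    // condition MissingBackpressureException reports in RxJava.
    public static int pump(int total, int capacity) throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(capacity);
        Thread producer = new Thread(() -> {
            for (int i = 0; i < total; i++) {
                try {
                    queue.put(i); // blocks while the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        producer.start();
        int consumed = 0;
        while (consumed < total) {
            queue.take(); // slow consumer still receives every message
            consumed++;
        }
        producer.join();
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("consumed " + pump(100, 10) + " messages");
    }
}
```

This framing suggests why the same transformation could pass on 8.0 and fail on 9.0: a change in how the step's internal pipeline buffers incoming records, rather than anything in the messages themselves.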
RE: Kafka Consumer step in Pentaho Data Integration is not streaming the events from Kafka and the transformation gets stopped on its own immediately
Posted By Rajkumar Venkatasamy, 05-04-2020 14:31
Found In Community: Pentaho
Hi Pentaho Technical team / Community members, it seems that the Pentaho consumer is not functioning properly when the messages are compressed by the producer. That seems to be the scenario in my case, where the consumer didn't work as detailed above. Our Kafka producer published messages in lz4 compression ...
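For reproducing the scenario, compression is a producer-side setting: `compression.type` is a standard Kafka producer configuration key, and the consumer decompresses transparently. The sketch below only builds the relevant `Properties` (no broker connection; the class name, method name, and `localhost:9092` address are placeholders for illustration):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Producer properties that would publish lz4-compressed messages.
    // "compression.type" is a standard Kafka producer setting; brokers
    // and consumers must support the chosen codec for delivery to work.
    public static Properties lz4ProducerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers); // placeholder address
        props.put("compression.type", "lz4");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        Properties p = lz4ProducerProps("localhost:9092");
        System.out.println(p.getProperty("compression.type"));
    }
}
```

Publishing the same test messages with `compression.type` set to `none` versus `lz4` would isolate whether the Pentaho consumer step, rather than the payload, is what breaks on compressed batches.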
Kafka Consumer step in Pentaho Data Integration is not streaming the events from Kafka and the transformation gets stopped on its own immediately
Posted By Rajkumar Venkatasamy, 04-23-2020 09:58
Found In Community: Pentaho
Hi Community members, we are exploring PDI for Kafka streaming / ETL purposes, as part of which I tried using the Kafka Consumer step to consume the event stream from the Kafka server. The same Kafka server works fine with other consumer applications (written in Java). But when used with Pentaho (version ...
Kafka Consumer step in Pentaho Data Integration is not streaming the events from Kafka and the transformation gets stopped on its own immediately
Posted By Rajkumar Venkatasamy, 04-23-2020 09:58
Found In Library: Pentaho
© Hitachi Vantara LLC 2023. All Rights Reserved.