Profile
Andry RAKOTONDRASOA
This individual is no longer active. Application functionality related to this individual is limited.
List of Contributions
My Content: 1 to 12 of 12 total (search criteria: ALL)
RE: Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 10-11-2019 15:27
Found In Community: Pentaho
Hi All, After a few weeks of testing, I still have the same error on my Pentaho Kafka consumer job. We must put the jobs into production in mid-November, and I have two weeks to find a solution for the Kafka consumer job. If you have questions or something to clarify, please don't hesitate to comment. Regards ...
RE: Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 05-13-2019 14:30
Found In Community: Pentaho
Hi Jens Bleuel, We have already tried your first proposal because we had to reduce the size of the log file (catalina.out). And running the ETLs in file execution mode returns the same errors: 2019/05/13 16:03:14 - grfs-message.0 - Step execution finished (Input=0, Output=0, Read=10000, Written=10000, Updated=0, ...
RE: Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 05-09-2019 09:21
Found In Community: Pentaho
Hi Virgilio Pierini, At the moment, we're only testing Kafka consumers with transformations whose batch configuration is 3000 ms or 1000 rows. As I said in my previous comment, the transformations have a memory problem after a few hours with a topic without LAG in Kafka, and after 2 minutes for a topic with LAG ...
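The batch settings mentioned (3000 ms or 1000 rows) describe a flush-on-duration-or-size trigger. A minimal Python sketch of that pattern, with hypothetical names and a generic record source standing in for the Kafka step:

```python
import time

def batch_records(source, max_rows=1000, max_ms=3000):
    """Group records into batches, flushing when either the row limit
    or the time limit is reached, whichever comes first."""
    batch, deadline = [], time.monotonic() + max_ms / 1000.0
    for record in source:
        batch.append(record)
        if len(batch) >= max_rows or time.monotonic() >= deadline:
            yield batch
            batch, deadline = [], time.monotonic() + max_ms / 1000.0
    if batch:  # flush any remainder at end of stream
        yield batch

# A fast, finite source of 2500 records with max_rows=1000
# yields batches of 1000, 1000 and 500.
sizes = [len(b) for b in batch_records(range(2500))]
print(sizes)  # [1000, 1000, 500]
```

The key property for the memory problem in this thread is that only one batch is held at a time; records are not buffered past the flush point.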
RE: Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 05-07-2019 08:09
Found In Community: Pentaho
Hi Ricardo Diaz, Please find attached the screenshots of the ETL. Both transformations (screenshots_1 and screenshot_2) have the same memory problem after a few hours with a topic without LAG in Kafka, and after 2 minutes for a topic with LAG (LAG > 4500000). Regards, Andry
RE: Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 05-03-2019 11:52
Found In Community: Pentaho
Hi Steven Brown, Thanks for your reply. We have already tested this configuration but we still have the same error: OutOfMemory. With a 5000000 LAG in the Kafka cluster, the ETL reaches 10 GB of memory usage in 45 seconds. So, if it is related to the Pentaho environment, using the ETL Kafka consumer ...
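For a "GC overhead limit exceeded" error in PDI, one common first step (separate from fixing the consumer's buffering) is raising the JVM heap for the launch scripts. A sketch, assuming the standard PENTAHO_DI_JAVA_OPTIONS variable read by the PDI scripts; the heap values and job path are illustrative, not a recommendation:

```shell
# Illustrative only: raise the PDI JVM heap before launching the job.
# PENTAHO_DI_JAVA_OPTIONS is picked up by the PDI launch scripts
# (spoon.sh / kitchen.sh); 2g/12g are example values for this thread's
# workload, not tuned settings.
export PENTAHO_DI_JAVA_OPTIONS="-Xms2g -Xmx12g -XX:+UseG1GC"
./kitchen.sh -file=/path/to/kafka_consumer_job.kjb
```

Note that a larger heap only delays the failure if the consumer buffers an unbounded backlog (as the 10 GB in 45 seconds figure above suggests); the batch limits discussed elsewhere in the thread address the root cause.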
Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 04-29-2019 09:23
Found In Community: Pentaho
This content was either too long or contained formatting that did not work with our migration. A PDF document is attached that contains the original representation. We are trying to run a Kafka consumer transformation in Pentaho Server 8.1. The transformation subscribes to a single topic that sends ...
Kafka consumer java.lang.OutOfMemoryError: GC overhead limit exceeded
Posted By Andry RAKOTONDRASOA, 04-29-2019 09:23
Found In Library: Pentaho
RE: Fields not found in MongoDB output from Kafka consumer
Posted By Andry RAKOTONDRASOA, 02-19-2019 13:34
Found In Community: Pentaho
Hi Ravikumar Kamma, Thanks for your reply. I found the solution to this error. By default, the Kafka consumer input and output fields have the same name but not the same format. That's why it can insert into an RDBMS (PostgreSQL) but not into MongoDB. In short, I think it's a case-sensitivity problem. Reg ...
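The case-sensitivity mismatch described above can be sketched in a few lines: MongoDB compares document field names exactly, while an RDBMS typically folds unquoted identifiers to one case. This is a plain-Python illustration of the two lookup behaviors, not the actual PDI connectors:

```python
# A document as the MongoDB output step might produce it.
doc = {"eventDate": "2019-02-18"}

def rdbms_lookup(row, column):
    """Approximates an RDBMS resolving unquoted identifiers
    case-insensitively by folding names to lowercase."""
    folded = {k.lower(): v for k, v in row.items()}
    return folded.get(column.lower())

# The RDBMS-style lookup tolerates a case mismatch...
print(rdbms_lookup(doc, "EVENTDATE"))  # 2019-02-18
# ...but a MongoDB-style exact-key lookup does not.
print(doc.get("EVENTDATE"))            # None
```

This is why the same field mapping can work against PostgreSQL yet fail to find the field in MongoDB.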
RE: Get string from month integer
Posted By Andry RAKOTONDRASOA, 02-18-2019 14:50
Found In Community: Pentaho
Hi, Your transformation is good; just replace Meta-data > Format MMMM with 0000 in the month_string step. Regards
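The suggested fix swaps a date-style month-name mask (MMMM) for a zero-padded numeric mask (0000). The difference between the two renderings can be illustrated in plain Python (the masks themselves are Pentaho format strings, not Python ones):

```python
import calendar

month = 3  # the month integer the thread's transformation starts from

# A MMMM-style mask renders the month as its name...
month_name = calendar.month_name[month]
print(month_name)       # March

# ...while a 0000-style numeric mask zero-pads the integer to 4 digits.
month_padded = f"{month:04d}"
print(month_padded)     # 0003
```

So with a month integer as input, the numeric mask is the one that converts it to a fixed-width string; the MMMM mask only applies to an actual date value.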
Fields not found in MongoDB output from Kafka consumer
Posted By Andry RAKOTONDRASOA, 02-18-2019 13:39
Found In Community: Pentaho
Hi everyone, I'm new to MongoDB and data streaming. I have a transformation that must insert streaming data from Kafka into MongoDB, but in the 3 versions of PDI (8.0, 8.1 and 8.2), MongoDB returns the same error: 2019/02/18 13:49:32 - mdbo-centreon-alert.0 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 ...
Fields not found in MongoDB output from Kafka consumer
Posted By Andry RAKOTONDRASOA, 02-18-2019 13:39
Found In Library: Pentaho
RE: Problem with transformation hour formats
Posted By Andry RAKOTONDRASOA, 10-18-2018 22:13
Found In Community: Pentaho
Hello Tereza, Please find attached a transformation using Regex in a script step. Regards
© Hitachi Vantara LLC 2023. All Rights Reserved.
Powered by Higher Logic