Pentaho


 Fields not found in MongoDB output from Kafka consumer

Andry RAKOTONDRASOA posted 02-18-2019 13:39

Hi everyone, I'm new to MongoDB and data streaming.

I have a transformation that must insert streaming data from Kafka into MongoDB, but in all three versions of PDI I tried (8.0, 8.1, and 8.2), the MongoDB Output step returns the same error:

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : Unexpected error

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : org.pentaho.di.core.exception.KettleException:

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - Some expected Mongo fields not found in step input fields. Check step configuration. Mongo fields not found: 'message', 'topic', 

2019/02/18 13:49:32 - mdbo-centreon-alert.0 -

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - at org.pentaho.di.trans.steps.mongodboutput.MongoDbOutput.checkInputFieldsMatch(MongoDbOutput.java:514)

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - at org.pentaho.di.trans.steps.mongodboutput.MongoDbOutput.processRow(MongoDbOutput.java:148)

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)

2019/02/18 13:49:32 - mdbo-centreon-alert.0 - at java.lang.Thread.run(Unknown Source)

 

I checked that the two fields (message and topic) exist in the steps that precede the MongoDB Output step.

I performed the same insertion into a relational database (Table output step) and into a file, and the error does not occur there.

Please, can someone help me solve this error? I'm stuck on this step. (The transformation files are attached.)

 

Thanks in advance.


#Kettle
#Pentaho
#PentahoDataIntegrationPDI
Ravikumar Kamma

I am not able to understand much from the error.

Could you please try specifying the database and collection names explicitly in the Output options?

Not sure if variable substitution is supported here.


~Ravik

Andry RAKOTONDRASOA

Hi Ravikumar Kamma,

Thanks for your reply.

I found the solution to this error. By default, the output fields of the Kafka Consumer step and the fields configured in the MongoDB Output step have the same names but not the same case.

That's why the rows could be inserted into an RDBMS (PostgreSQL), which matches unquoted column names case-insensitively, but not into MongoDB, where field matching is exact.

In short, I think it was a case sensitivity problem.
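For anyone hitting the same error: the behavior can be illustrated with a minimal sketch. This is not the actual PDI source, just an assumed approximation of the exact-match check that MongoDbOutput.checkInputFieldsMatch performs; the field names "Message"/"Topic" are hypothetical examples of a case mismatch.

```java
import java.util.Arrays;
import java.util.List;

public class FieldMatchSketch {
    // Returns true only if every expected Mongo field has an exact,
    // case-sensitive match among the incoming step fields -- roughly
    // the kind of check that raises "Mongo fields not found".
    static boolean fieldsMatch(List<String> incoming, List<String> expected) {
        return incoming.containsAll(expected);
    }

    public static void main(String[] args) {
        List<String> fromKafka = Arrays.asList("Message", "Topic");   // capitalized
        List<String> mongoFields = Arrays.asList("message", "topic"); // lower-case

        // Exact comparison fails even though the names "look" the same:
        System.out.println(fieldsMatch(fromKafka, mongoFields)); // prints "false"

        // After renaming the fields (e.g. with a Select Values step
        // between the Kafka Consumer and the MongoDB Output):
        List<String> renamed = Arrays.asList("message", "topic");
        System.out.println(fieldsMatch(renamed, mongoFields)); // prints "true"
    }
}
```

So the practical fix is to make the incoming field names match the MongoDB Output configuration exactly, for instance by renaming them in a Select Values step.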


Regards
