Overview
Fiscal Event Post Processor is a streaming pipeline for validated fiscal event data. Apache Kafka is used to stream the validated fiscal event data, which is then dereferenced, unbundled, and flattened before being pushed to the MongoDB and Druid data stores.
Version History
Current version: 0.1.0
Prerequisites
Before you proceed with the configuration, make sure the following prerequisites are met:
Java 8
Apache Kafka and Kafka-Connect server should be up and running.
Druid DB & MongoDB should be up and running.
The following dependent services are required:
iFix Master data service.
iFix Fiscal Event service.
Features
Fiscal Event Post Processor consumes validated fiscal event data from the Kafka topic named “fiscal-event-request-validated” and processes it in the following steps:
The validated fiscal event data is dereferenced. For dereferencing, service IDs such as the project ID, COA ID, and tenant ID are passed to the corresponding services (Master service and Department Entity service), which return the corresponding objects. Once the fiscal event data is dereferenced, the same data is pushed to the Mongo sink topic and the dereference topic.
The Mongo connector picks up the data from the “fiscal-event-mongodb-sink” topic and pushes it to the MongoDB data store.
The unbundle consumer picks up the dereferenced fiscal event data from the dereference topic. The dereferenced fiscal event data is unbundled and then flattened. Once flattening is complete, the data is pushed to the Druid sink topic.
The flattened fiscal event data is pushed to Druid from the topic named “fiscal-event-druid-sink”.
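The unbundle-and-flatten step above can be sketched in plain Java. This is a minimal sketch only: the `FiscalEvent` and `Amount` shapes, field names, and the CSV-style output row are illustrative assumptions, not the service's actual schema.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the unbundle/flatten step: one fiscal event carrying several
// amount line items is unbundled into one flat record per line item,
// suitable for pushing to a Druid sink topic.
// All class and field names below are illustrative assumptions.
public class FlattenSketch {

    static class Amount {
        final String coaId;
        final double amount;
        Amount(String coaId, double amount) { this.coaId = coaId; this.amount = amount; }
    }

    static class FiscalEvent {
        final String eventId;
        final String tenantId;
        final List<Amount> amountDetails;
        FiscalEvent(String eventId, String tenantId, List<Amount> amountDetails) {
            this.eventId = eventId; this.tenantId = tenantId; this.amountDetails = amountDetails;
        }
    }

    // One flattened row per amount line item.
    static List<String> flatten(FiscalEvent event) {
        List<String> rows = new ArrayList<>();
        for (Amount a : event.amountDetails) {
            rows.add(event.eventId + "," + event.tenantId + "," + a.coaId + "," + a.amount);
        }
        return rows;
    }

    public static void main(String[] args) {
        FiscalEvent event = new FiscalEvent("FE-1", "pb", Arrays.asList(
                new Amount("COA-100", 1500.0),
                new Amount("COA-200", 250.0)));
        flatten(event).forEach(System.out::println);
        // prints:
        // FE-1,pb,COA-100,1500.0
        // FE-1,pb,COA-200,250.0
    }
}
```

In the real pipeline this logic sits between a Kafka consumer on the dereference topic and a producer on the Druid sink topic; the sketch isolates only the transformation.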
Interaction Diagram
Configurations and Setup
Update all the DB, Kafka producer and consumer, and URI configurations in the dev.yaml, qa.yaml, and prod.yaml files.
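A fragment of such an environment file might look like the following. The keys and values here are illustrative assumptions only; check them against the actual property names used in dev.yaml, qa.yaml, and prod.yaml.

```yaml
# Illustrative sketch only -- the real keys in dev.yaml/qa.yaml/prod.yaml may differ.
kafka:
  bootstrap-servers: localhost:9092
  topics:
    fiscal-event-validated: fiscal-event-request-validated
    fiscal-event-mongodb-sink: fiscal-event-mongodb-sink
    fiscal-event-druid-sink: fiscal-event-druid-sink
mongodb:
  uri: mongodb://localhost:27017/fiscal-events
druid:
  coordinator-uri: http://localhost:8081
```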
To start the Mongo and Druid connectors, follow these steps.
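Connectors on a Kafka Connect cluster are typically registered by POSTing a JSON configuration to the Connect REST API. The sketch below shows a hypothetical registration for the MongoDB sink; the connector class follows the MongoDB Kafka connector's documented naming, but the connector name, URI, database, and collection values are assumptions for this setup, not the project's actual configuration.

```json
{
  "name": "fiscal-event-mongodb-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "fiscal-event-mongodb-sink",
    "connection.uri": "mongodb://localhost:27017",
    "database": "fiscal-events",
    "collection": "fiscal_event"
  }
}
```

This JSON would be POSTed to the Kafka Connect REST endpoint (by convention, port 8083) to create the connector; the Druid side is usually fed via Druid's own Kafka ingestion supervisor rather than a Connect sink.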