...
Open the Druid console
Go to the Load Data section
Select Other
Click on Submit Supervisor
Copy and paste the JSON below into the available text box.
Code Block { "type": "kafka", "spec": { "ioConfig": { "type": "kafka", "consumerProperties": { "bootstrap.servers": "kafka-v2.ifix:9092" }, "topic": "fiscal-event-druid-sink", "inputFormat": { "type": "json" }, "useEarliestOffset": true }, "tuningConfig": { "type": "kafka" }, "dataSchema": { "dataSource": "fiscal-event", "timestampSpec": { "column": "eventTime", "format": "millis" }, "transformSpec": { "transforms": [ { "type": "expression", "name": "bill", "expression": "if(\"eventType\" == 'BILL', \"amount\", 0)" }, { "type": "expression", "name": "receipt", "expression": "if(\"eventType\" == 'RECEIPT', \"amount\", 0)" }, { "type": "expression", "name": "payment", "expression": "if(\"eventType\" == 'PAYMENT', \"amount\", 0)" }, { "type": "expression", "name": "demand", "expression": "if(\"eventType\" == 'DEMAND', \"amount\", 0)" } ] }, "dimensionsSpec": { "dimensions": [ "id", { "type": "double", "name": "amount" }, "version", "tenantId", "eventId", { "type": "long", "name": "ingestionTime" }, "eventType", "referenceId", "parentEventId", "parentReferenceId", { "type": "long", "name": "fromBillingPeriod" }, { "type": "long", "name": "toBillingPeriod" }, "coa.id", "coa.coaCode", "coa.majorHead", "coa.majorHeadName", "coa.subMajorHead", "coa.subMajorHeadName", "coa.minorHead", "coa.minorHeadName", "coa.subHead", "coa.subHeadName", "coa.groupHead", "coa.groupHeadName", "coa.objectHead", "coa.objectHeadName", "department.id", "department.code", "department.name", "departmentEntity.id", "departmentEntity.code", "departmentEntity.name", "departmentEntity.hierarchyLevel", "departmentEntity.ancestry[0].id", "departmentEntity.ancestry[0].code", "departmentEntity.ancestry[0].name", "departmentEntity.ancestry[0].hierarchyLevel", "departmentEntity.ancestry[1].id", "departmentEntity.ancestry[1].code", "departmentEntity.ancestry[1].name", "departmentEntity.ancestry[1].hierarchyLevel", "departmentEntity.ancestry[2].id", "departmentEntity.ancestry[2].code", "departmentEntity.ancestry[2].name", "departmentEntity.ancestry[2].hierarchyLevel", "departmentEntity.ancestry[3].id", "departmentEntity.ancestry[3].code", "departmentEntity.ancestry[3].name", "departmentEntity.ancestry[3].hierarchyLevel", "departmentEntity.ancestry[4].id", "departmentEntity.ancestry[4].code", "departmentEntity.ancestry[4].name", "departmentEntity.ancestry[4].hierarchyLevel", "departmentEntity.ancestry[5].id", "departmentEntity.ancestry[5].code", "departmentEntity.ancestry[5].name", "departmentEntity.ancestry[5].hierarchyLevel", "departmentEntity.ancestry[6].id", "departmentEntity.ancestry[6].code", "departmentEntity.ancestry[6].name", "departmentEntity.ancestry[6].hierarchyLevel", "departmentEntity.ancestry[7].id", "departmentEntity.ancestry[7].code", "departmentEntity.ancestry[7].name", "departmentEntity.ancestry[7].hierarchyLevel", "departmentEntity.ancestry[8].id", "departmentEntity.ancestry[8].code", "departmentEntity.ancestry[8].name", "departmentEntity.ancestry[8].hierarchyLevel", "departmentEntity.ancestry[9].id", "departmentEntity.ancestry[9].code", "departmentEntity.ancestry[9].name", "departmentEntity.ancestry[9].hierarchyLevel", "departmentEntity.ancestry[10].id", "departmentEntity.ancestry[10].code", "departmentEntity.ancestry[10].name", "departmentEntity.ancestry[10].hierarchyLevel", "expenditure.id", "expenditure.code", "expenditure.name", "expenditure.type", "government.id", "government.name", "project.id", "project.code", "project.name", { "name": "bill", "type": "double" }, { "name": "receipt", "type": "double" }, { 
"name": "demand", "type": "double" }, { "name": "payment", "type": "double" } ] }, "granularitySpec": { "queryGranularity": "none", "rollup": false, "segmentGranularity": "hour" } } } }
You should verify the Kafka topic name and the Kafka bootstrap server address before submitting the config.
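A quick way to do this check is to list the topics the broker knows about. The sketch below is a minimal example assuming the kafka-python client is installed and that kafka-v2.ifix:9092 is reachable from the machine running it; adjust both values to your environment.

# Pre-flight check: confirm the topic from the supervisor spec exists on the broker.
from kafka import KafkaConsumer

BOOTSTRAP = "kafka-v2.ifix:9092"        # must match consumerProperties."bootstrap.servers"
TOPIC = "fiscal-event-druid-sink"       # must match the "topic" field in the spec

consumer = KafkaConsumer(bootstrap_servers=BOOTSTRAP)
topics = consumer.topics()              # set of topic names known to the cluster
consumer.close()

if TOPIC in topics:
    print(f"OK: topic '{TOPIC}' found on {BOOTSTRAP}")
else:
    print(f"Missing: topic '{TOPIC}' not found. Available: {sorted(topics)}")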
Now submit the config; data ingestion into the fiscal-event data source should start.
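If you prefer not to use the console, the same spec can be submitted to the Druid Overlord REST API. In the sketch below, the Overlord URL druid-overlord.ifix:8081 and the file name fiscal-event-supervisor.json are placeholders for your environment; only the API paths (/druid/indexer/v1/supervisor and the per-supervisor status endpoint) are standard Druid.

# Submit the Kafka supervisor spec via the Overlord API instead of the console.
import json
import requests

DRUID_OVERLORD = "http://druid-overlord.ifix:8081"    # placeholder; point at your Overlord or Router

with open("fiscal-event-supervisor.json") as f:       # the JSON spec shown above, saved to a file
    spec = json.load(f)

# POST /druid/indexer/v1/supervisor creates (or updates) the supervisor
resp = requests.post(f"{DRUID_OVERLORD}/druid/indexer/v1/supervisor", json=spec)
resp.raise_for_status()
print("Supervisor submitted:", resp.json())

# For Kafka ingestion the supervisor id is the datasource name ("fiscal-event"),
# so its status can be polled to confirm ingestion has started.
status = requests.get(f"{DRUID_OVERLORD}/druid/indexer/v1/supervisor/fiscal-event/status")
status.raise_for_status()
print("Supervisor state:", status.json()["payload"]["state"])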
Interaction Diagram
Environment
No environment-specific variables are required for this migration.
Configurations and Setup
Update all the DB, Kafka producer and consumer, and URI configurations in the dev.yaml, qa.yaml, and prod.yaml files.
...