DIGIT Developer's Guide - Birth Registration Application

This guide aims to enable readers to set up a development environment on their system, develop their own microservice, and integrate/communicate with services running remotely in the sandbox environment.

Prerequisites:

  • Prior Knowledge of Java/J2EE.

  • Prior Knowledge of Spring Boot.

  • Prior Knowledge of REST APIs and related concepts like path parameters, headers, JSON etc.

  • Prior knowledge of Git

  • PostgreSQL

  • Kafka

  • The following services should be up and running (or should be pointed to the sandbox environment):

    • User

    • MDMS

    • Persister

    • Location

    • Localization

    • Id-Gen

    • Billing-service

    • URL-shortener

If you are starting off with a fresh Linux/Windows machine, we will set up a few things before we can start developing a service -

i) Install Git - Git is software for tracking changes in any set of files, usually used for coordinating work among programmers collaboratively developing source code during software development. Git can be downloaded from the following links -

Git for windows

Git for linux

ii) Install JDK8 -

JDK8 for windows

JDK8 for linux

iii) Install IDE - For creating SpringBoot/Java applications we recommend using IntelliJ IDE. IntelliJ can be downloaded from the following links -

IntelliJ for windows

IntelliJ for linux

iv) Install Kafka (version 3.2.0) - Kafka is the messaging queue that DIGIT services use to communicate with each other asynchronously. To install Kafka, follow these links -

Kafka for windows

Kafka for linux

v) Install Postman - Postman is the tool we use to hit and test the APIs exposed by our various services. To install Postman, follow these links -

Postman for windows

Postman for linux

vi) Install Kubectl - Kubectl is the tool that we use to interact with services deployed on our sandbox environment.

https://kubernetes.io/docs/tasks/tools/install-kubectl-windows/

https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/

vii) Install aws-iam-authenticator - https://docs.aws.amazon.com/eks/latest/userguide/install-aws-iam-authenticator.html

viii) Post installation of kubectl, you need to add configuration to allow access to resources present in our sandbox environment. Steps for the creation of this config file are part of the setup guides mentioned above. Once the config file is created, add the following content to it -

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUN5RENDQWJDZ0F3SUJBZ0lCQURBTkJna3Foa2lHOXcwQkFRc0ZBREFWTVJNd0VRWURWUVFERXdwcmRXSmwKY201bGRHVnpNQjRYRFRJd01EVXhNekV6TlRReE5sb1hEVE13TURVeE1URXpOVFF4Tmxvd0ZURVRNQkVHQTFVRQpBeE1LYTNWaVpYSnVaWFJsY3pDQ0FTSXdEUVlKS29aSWh2Y05BUUVCQlFBRGdnRVBBRENDQVFvQ2dnRUJBTkJyClN6aHJjdDNORE1VZVF5TENTYWhwbEgyajJ1bkdYSWk1QThJZjF6OTgwNEZpSjZ6OS9qUHVpY3FjaTB1VURJQnUKS3hjdVFJRkozMG1MRWg3RGNiQlh2dDRnUlppZWtlZzVZNGxDT2NlTWZFZkFHY01KdDE1RVVCUFVzdlYyclRMcQp6a0ovRzVRUUFXMmhwREJLaFBoblZJTktYN1YzOU9tMUtuTklTbllPWERsZ1g3dW9Wa3I1OFhzREFHWEVsdC9uClpyc3laM2pkMWplWS8rMXlQQzlxbkorT0QwZlRQVGdCV1hMQlFwMHZKdHVzNE1JV2JLdkhlcUZ5eWtGd2V5MmoKSzk5eU1Yb0oraUpCaFJvWGllU3ZrNnFYdG44S2l4bVJtOXZPQk1hcWpuNkwwTjc3UWNCNjVRaHNKb0tWKzBiMQp5VVpJTHVTWWVTY0Yra3h6TzFVQ0F3RUFBYU1qTUNFd0RnWURWUjBQQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCCi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFNdnF3THl6d2RUL05OWlkvanNzb0lmQmIyNDgKZ3oxSHRuSXJ4UGhaY3RrYjBSMExxeTYzRFZBMFNSN0MrWk90aTNNd3BHMkFSVHVzdG1vYm9HV3poUXlXRk16awpVMVNIZSt6S3poeGcweUpjUjliZnlxM1ZtQVVCZlQyTVV5cVl2OVg0aWxpbmV0SURQaFBuWnlPMERQTHJITGoyCkcxZy8vWmZYbmFCT2k3dlZLSXFXUUR6RlltWGkwME9vOEVoalVyMU5sQ3FISnF1dUo3TlRWQWk1cXA0Qm1xWU8KUTBrbTVxTVVHbG9ZdkNmN1lHQWREWTVnWGg4dzFVMVdaNWNub0Q4WWc3aEtlSjRMRzRram1adlNucGZrS3VxNApiVDdUSjEwUEZlWFJkek8xa2FkQ3VMQSttUlg3OEd5WEw0UTZnOFdPUlhOVDYzdXN3MnlpMXVVN1lMTT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=
    server: https://3201E325058272AA0990C04346DA6E82.yl4.ap-south-1.eks.amazonaws.com
  name: eks_egov-dev
contexts:
- context:
    cluster: eks_egov-dev
    namespace: egov
    user: eks_egov-dev
  name: dev
current-context: dev
kind: Config
preferences: {}
users:
- name: eks_egov-dev
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      args:
      - token
      - -i
      - egov-dev
      command: aws-iam-authenticator
      env:
      - name: AWS_ACCESS_KEY
        value: AKIA42DEGMQ2KKCGGNXA
      - name: AWS_SECRET_ACCESS_KEY
        value: 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
      - name: AWS_REGION
        value: ap-south-1

*** NOTE - In case you run into an error stating “error: You must be logged in to the server (Unauthorized)”, try adding sudo before the command. For example, “sudo kubectl get pods”. That should resolve the error.

Swagger Documentation & Generating Project from it:

The first step in developing any microservice is preparing Swagger documentation that details all the APIs the service is going to expose for clients to consume.

Swagger is a utility which allows you to describe the structure of your APIs so that machines can read them. Readers can go through the following link to understand what swagger is in depth - What is Swagger?

Now comes the big question, why swagger?

There are a couple of reasons why we emphasize the creation of Swagger contracts at the start of creating any microservice.

  1. It allows us to generate REST API documentation and interact with the APIs (using Swagger UI). Interacting with the APIs helps clients understand how they respond to various parameters.

  2. Swagger codegen tool and SwaggerHub can be used to generate server stubs and client SDKs using these swagger contracts (fancy way of saying it implements all the humdrum code that is required to get the API skeleton ready, right from the controller layer to the models). This in turn allows the developers to focus on the business logic rather than worrying about creating model classes and controller code.

 

The following tutorial can be used for the creation of swagger contracts - OpenAPI 3.0 Tutorial| Swagger Tutorial For Beginners | Design REST API Using Swagger Editor 

For generating projects from swagger contracts, we use our customized swagger codegen jar.

We have to download the jar from the following link - CODEGEN JAR LINK

Following is the generic command to create API skeleton for any swagger contract:

java -jar codegen-1.0-SNAPSHOT-jar-with-dependencies.jar -l -t -u {CONTRACT_PATH} -a {ARTIFACT_ID} -b {BASE_FOLDER}

 

For this guide, the following should be the sequence to generate API skeleton using codegen jar:

  1. Go to the folder where you have downloaded the codegen jar.

  2. Execute the following command:

java -jar codegen-1.0-SNAPSHOT-jar-with-dependencies.jar -l -t -u https://raw.githubusercontent.com/egovernments/DIGIT-OSS/DIGIT-DEVELOPER-TUTORIAL/municipal-services/docs/birth-registration.yaml -a birth-registration -b digit

 

3. Update the spring-boot-starter-parent version to 2.2.6.RELEASE (after updating the Spring Boot version, do a Maven update).

4. Put a slash in front of server.contextPath, and add this property to the application.properties file, which helps request handlers serve requests -

server.contextPath=/birth-registration
server.servlet.context-path=/birth-registration
  5. Add these external dependencies to pom.xml:

<dependency>
   <groupId>org.flywaydb</groupId>
   <artifactId>flyway-core</artifactId>
</dependency>
<dependency>
   <groupId>org.postgresql</groupId>
   <artifactId>postgresql</artifactId>
   <version>42.2.2.jre7</version>
</dependency>

 

Setting up database connection and adding variables in application.properties

All dependent service host URLs and API endpoints should be added in application.properties. Along with these, any properties that may need to be overwritten during deployment should be part of this file (e.g. DB URL and passwords, Kafka server properties, control knobs for functionality, etc.). To remove boilerplate code for referring to variables from application.properties, we create a configuration class and autowire it wherever we need to refer to these variables. The following properties should be added for configuring the database and Kafka server (use the default values; in case you want to tune the Kafka server, these can be overwritten during deployment).

Once all the external dependencies have been added to pom.xml and these maven changes have been reloaded, the following properties should be added to application.properties file to configure database and kafka for development -
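A hedged sketch of the database-related properties, using standard Spring Boot keys; the URL, username and password below are local-development placeholders and must match your own PostgreSQL setup:

```properties
# Database configuration (illustrative local values)
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=postgres
```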

 

Kafka Configuration properties -
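The Kafka wiring might look like the sketch below, using standard spring-kafka keys. Note that DIGIT services often configure their own serializers/deserializers (e.g. via the tracer library), so treat the group id and serializer classes here as placeholders:

```properties
# Kafka configuration (illustrative local values)
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=birth-registration
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
```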

 

To add custom properties in application.properties file and then referencing them in your application -

Add SQL scripts to create DB using flyway:

Once the database has been configured, we will use flyway to create the tables in the postgres DB. The following properties should be configured in the application.properties file to enable flyway migration:

For adding the flyway scripts the following folder structure should be maintained:

 

Now, the migration files should be added in the main folder. Specific nomenclature should be followed while naming the file. The file name should be in the following format:

V[YEAR][MONTH][DAY][HR][MIN][SEC]__modulecode_ …_ddl.sql

Example: V20180920110535__tl_tradelicense_ddl.sql
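The naming convention above can be sketched as a small helper: the timestamp prefix is simply the creation date-time formatted as yyyyMMddHHmmss. The class and method names here are illustrative, not part of the guide's service:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical helper showing the flyway migration file naming convention:
// V[YEAR][MONTH][DAY][HR][MIN][SEC]__<modulecode>_ddl.sql
class MigrationFileNamer {
    static String name(String moduleCode, LocalDateTime createdAt) {
        // e.g. 2018-09-20 11:05:35 -> "20180920110535"
        String timestamp = createdAt.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
        return "V" + timestamp + "__" + moduleCode + "_ddl.sql";
    }
}
```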

 

We can reuse the flyway Docker image and script already created in other services. Links for these files are attached below; copy these two files into the db folder (this is required only while building the service on Jenkins and deploying it to the DIGIT cluster, and can be skipped for local development):

Flyway Docker Image

Script to run flyway migration

 

For this sample service, we will be using the following psql script to create the required tables - 
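As a hedged sketch of what such a DDL script might look like, the table and column names below are illustrative (the actual script linked above is authoritative), but the audit columns (createdby, createdtime, etc.) follow the usual DIGIT convention:

```sql
-- Hypothetical DDL sketch for a birth registration application table.
CREATE TABLE eg_bt_application (
    id                character varying(64) NOT NULL,
    tenantid          character varying(64) NOT NULL,
    applicationnumber character varying(64),
    babyfirstname     character varying(128),
    applicationstatus character varying(64),
    createdby         character varying(64),
    createdtime       bigint,
    lastmodifiedby    character varying(64),
    lastmodifiedtime  bigint,
    CONSTRAINT pk_eg_bt_application PRIMARY KEY (id)
);
```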

Project Structure:

We maintain the following project structure for all microservices -

Project structure can also be looked at this link - Git Link


Adding MDMS data:

MDMS data is the master data used by the application. This data is stored as JSON on git in the format given below. For any service delivery, typical master data will be the allowed values for certain fields and the tax heads (in case payment is part of the flow).

 

MDMS json files are in the following format -
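Structurally, every MDMS file carries a tenantId, a moduleName, and one or more named master-data arrays. The master name and fields in this sketch (RegistrationCharges, code, amount) are illustrative; the actual masters for this guide live in the sample file linked below:

```json
{
  "tenantId": "pb",
  "moduleName": "BirthRegistration",
  "RegistrationCharges": [
    {
      "code": "REGISTRATION_FEE",
      "amount": 100,
      "active": true
    }
  ]
}
```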

 

Once data is added to the MDMS repository, the MDMS service has to be restarted; it will then load the newly added/updated MDMS configs. A sample MDMS config file can be viewed here - Sample MDMS data file

 

Adding Workflow configuration:

Workflow configuration should be created based on the business requirements. The configuration can be inserted using the /businessservice/_create API. To create the workflow configuration, refer to the following documentation: Configuring Workflows For New Product/Entity

 

Workflow configuration create request for the sample birth-registration service that we are creating in this guide:
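A hedged sketch of the shape of such a businessservice/_create payload is shown below. The businessService code, state names, roles and SLA values are illustrative; only the overall structure (BusinessServices with a list of states, each carrying its allowed actions) reflects the workflow service's contract:

```json
{
  "RequestInfo": {},
  "BusinessServices": [
    {
      "tenantId": "pb",
      "businessService": "BTR",
      "business": "birth-registration",
      "businessServiceSla": 432000000,
      "states": [
        {
          "state": null,
          "applicationStatus": null,
          "isStartState": true,
          "isTerminateState": false,
          "actions": [
            { "action": "APPLY", "nextState": "PENDING_FOR_VERIFICATION", "roles": ["CITIZEN"] }
          ]
        },
        {
          "state": "PENDING_FOR_VERIFICATION",
          "applicationStatus": "PENDING_FOR_VERIFICATION",
          "isStartState": false,
          "isTerminateState": false,
          "actions": [
            { "action": "VERIFY", "nextState": "APPROVED", "roles": ["BTR_VERIFIER"] },
            { "action": "REJECT", "nextState": "REJECTED", "roles": ["BTR_VERIFIER"] }
          ]
        },
        {
          "state": "APPROVED",
          "applicationStatus": "APPROVED",
          "isStartState": false,
          "isTerminateState": true,
          "actions": []
        },
        {
          "state": "REJECTED",
          "applicationStatus": "REJECTED",
          "isStartState": false,
          "isTerminateState": true,
          "actions": []
        }
      ]
    }
  ]
}
```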

 

Import core service models:

Models/POJOs of the dependent services can be imported from the digit-core-models library (work on creating this library is ongoing). These models will be used in integration with the dependent services.

These are pre-written libraries which contain tracer support, common models like MDMS, authentication and authorization, and the capability to raise custom exceptions.

Once these core models are imported, it is safe to delete the RequestInfo, ResponseInfo classes from the models folder and use the ones present under common contract which we just imported.

 

Before starting development, create/update the following classes -

a) Under models folder, create RequestInfoWrapper POJO -
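The wrapper is a simple POJO holding the incoming RequestInfo. In the actual service the field type is the RequestInfo from the imported common contracts (and the real class would use Lombok annotations); a minimal stand-in is declared here only so the sketch compiles on its own:

```java
// Stand-in for org.egov.common.contract.request.RequestInfo (illustrative).
class RequestInfo {
    String apiId;
    String msgId;
}

// Sketch of the RequestInfoWrapper POJO: a container for RequestInfo in
// request bodies that otherwise carry only query parameters.
class RequestInfoWrapper {
    private RequestInfo requestInfo;

    RequestInfo getRequestInfo() { return requestInfo; }

    void setRequestInfo(RequestInfo requestInfo) { this.requestInfo = requestInfo; }
}
```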

b) Under config folder, create BTRConfiguration and MainConfiguration classes -

 

Controller Layer:

Controller Layer contains the REST API endpoints which the service wants to expose. The code flow will start from this class. Controller class should be marked with @RestController annotation. 

Adding @RestController is a convenient way of combining the @Controller and @ResponseBody annotations, which eliminates the need to annotate each of the request handler methods of the controller class with @ResponseBody.

Also, the @RequestMapping("/v1") annotation should be added on top of the controller class. It contains the version of the API (this will become part of the API endpoint URL).

 

Each of the handler methods should be annotated with @RequestMapping to expose them as REST API endpoints. Example - @RequestMapping(value="/document/_create", method = RequestMethod.POST)

Any request handler in the controller layer is going to have the following sequence of execution - 

  1. Making a call to the method in the Service layer and getting the response back from it.

  2. Building responseInfo.

  3. Building final response to be returned to the client.

For this guide, our controller class will contain the following content -

*** NOTE: At this point, your IDE must be showing a lot of errors but do not worry we will add all dependent layers as we progress through this guide and the errors will go away.

The codegen jar creates the search API with search parameters annotated with @RequestParam rather than taking the request parameters as a POJO. To fix this, we will create a POJO by the name of BirthApplicationSearchCriteria under the models folder. Put the following content in the POJO -

Also, create a utils folder under digit. Add a new java class under utils folder by the name of ResponseInfoFactory. Put the following content in this newly created class -
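The factory copies identifying fields from the incoming RequestInfo onto a ResponseInfo and stamps the outcome status. In the actual service the types come from the imported common contracts; minimal stand-in types are declared here so the sketch is self-contained, and the field names are illustrative:

```java
// Stand-ins for the common-contract request/response info classes (illustrative).
class RequestInfo {
    String apiId;
    String ver;
    Long ts;
    String msgId;
}

class ResponseInfo {
    String apiId;
    String ver;
    Long ts;
    String msgId;
    String status;
}

// Sketch of the ResponseInfoFactory pattern used across DIGIT services.
class ResponseInfoFactory {
    // Mirrors the request's identifying fields and records whether the call succeeded.
    static ResponseInfo createResponseInfoFromRequestInfo(RequestInfo request, boolean success) {
        ResponseInfo response = new ResponseInfo();
        if (request != null) {
            response.apiId = request.apiId;
            response.ver = request.ver;
            response.ts = request.ts;
            response.msgId = request.msgId;
        }
        response.status = success ? "successful" : "failed";
        return response;
    }
}
```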

Service Layer:

Request handlers in the Controller layer call upon the methods defined in the Service layer to perform business logic on the RequestData and prepare the Response to be returned back to the client.

For this guide, to create the service layer -

  1. Create a new folder within digit folder called service.

  2. Create a new class in this folder by the name BirthRegistrationService.

  3. Annotate the class with @Service annotation.

  4. Add the following content to the class -

*** NOTE: At this point, your IDE must be showing a lot of errors but do not worry we will add all dependent layers as we progress through this guide and the errors will go away.

 

The service layer typically contains the following layers - 

  1. Validation Layer - All business validation logic should be added in this class. For example, verifying the values against the master data, ensuring non-duplication of data, etc.

In this guide, for creating the validation layer, the following steps should be followed -

a. Create a folder under digit by the name of validators. This is being done to keep the validation logic separate, so that the code is easy to navigate and readable.

b. Create a class by the name of BirthApplicationValidator

c. Annotate the class with @Component annotation and put the following content in the class -

 

*** NOTE: For the sake of simplicity, only the above-mentioned validations have been implemented. Required validations will vary on a case-by-case basis.

2. Enrichment Layer - This layer will enrich the request. System generated values like id, auditDetails etc. will be generated and added to the request.

In this guide, for creating the enrichment layer, the following steps should be followed -

a. Create a folder under digit by the name of enrichment. Again, this is being done to keep the enrichment code separate from the business logic, so that the codebase is easy to navigate and readable.

b. Create a class by the name of BirthApplicationEnrichment

c. Annotate the class with @Component and add the following methods to the class -

*** NOTE: For the sake of simplicity, only the above-mentioned enrichment methods have been implemented. Required enrichment will vary on a case-by-case basis.

3. Integration - A separate class should be created for integrating with each dependent microservice. Only one method from that class should be called from the main service class for integration.

In this guide, we will be showcasing how we can integrate our microservices with other microservices like MDMS, IdGen, User and Workflow.

For interacting with other microservices, we can create and implement the following ServiceRequestRepository class under repository folder -

a. Integration with MDMS - Integration with MDMS requires the following steps to be followed:

i) Add a new MDMS file in MDMS repo. For this guide, a sample MDMS file has already been added which can be found here - egov-mdms-data/BtrCharges.json at DEV · egovernments/egov-mdms-data

ii) Restart MDMS service after adding the new file.

iii) Once restarted, hit the curl mentioned below to verify that the new file has been properly added -

iv) Once verified, we can call mdms service from within our application and fetch the required master data. For this, create a java class by the name of MdmsUtil under utils folder. Annotate this class with @Component and put the following content in the class -

v) Add the following properties in application.properties file -
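The MDMS-related properties might look like the fragment below; the key names are assumptions and must match whatever the MdmsUtil class reads, and the host is a local-port-forward placeholder:

```properties
# MDMS integration (illustrative key names and host)
egov.mdms.host=http://localhost:8094
egov.mdms.search.endpoint=/egov-mdms-service/v1/_search
```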

b. Integration with IdGen - Integration with Idgen requires the following steps -

i) Add the Id Format that needs to be generated in this file - Id Format Mdms File

ii) For this tutorial, the following Id format has been added as part of this PR - Tutorial Id Format

iii) Now, restart IDGen service and Mdms service and port-forward IDGen service to port 8285 using -

iv) Hit the following curl to verify that the format is added properly -

v) Once verified, we can call idgen service from within our application and generate registrationId. For this, create a java class by the name of IdgenUtil under utils folder. Annotate this class with @Component and put the following content in the class -

Add the following model POJOs under models folder -

vi) Add the following properties in application.properties file -
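The idgen-related properties might look like the fragment below. The key names, id name and format are assumptions for illustration; the id name and format must match the entry added to the Id Format MDMS file earlier, and the host reflects the port-forward to 8285 from step iii:

```properties
# IdGen integration (illustrative key names and values)
egov.idgen.host=http://localhost:8285
egov.idgen.path=egov-idgen/id/_generate
egov.idgen.btr.id.name=btr.registrationid
egov.idgen.btr.id.format=BTR-[cy:yyyy-MM-dd]-[SEQ_BTR_REG_ID]
```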

c. Integration with User service - Integration with user service requires the following steps -

i) Create a new class under utils by the name of UserUtil and update the User POJO to have the following content -

ii) Annotate the created UserUtil class with @Component and add the following code in the created class -

iii) Create the following POJOs -

iv) Create a class by the name of UserService under the service folder and add the following content to it -

v) Add the following properties in the application.properties file -

d) Integration with URL Shortener - Integration with URL shortener requires the following steps -

i) Create a new class by the name of UrlShortnerUtil

ii) Annotate this class with @Component and add the following code -

iii) Add the following properties in application.properties file -

e) Integration with workflow - Integration with workflow service requires the following steps -

i) Update the BirthRegistrationApplication POJO to have following contents -

Create ProcessInstance, State, Action, ProcessInstanceRequest, ProcessInstanceResponse , BusinessService , BusinessServiceResponse POJOs -

ii) Next, we have to create a class to transition the workflow object across its states. For this, create a class by the name of WorkflowService and annotate it with @Service annotation. Add the following content to this class -

iii) Add the following properties to application.properties file -

4. Calculation - The calculation class will contain the calculation logic for the given service delivery. Based on the application submitted, the calculator class will calculate the tax/charges and call the billing service to generate a demand.

For our guide, we are going to create a sample calculation class with some dummy logic. For this, we are going to perform the following steps -

i) Create a class under service folder by the name of CalculationService

ii) Now, annotate this class with @Service annotation and add the following logic within it -

Repository Layer:

Methods in the service layer, upon performing all the business logic, call methods in the Repository layer to persist or lookup data i.e. it interacts with the configured data store. For executing the queries, JdbcTemplate class is used. JdbcTemplate takes care of creation and release of resources such as creating and closing the connection etc. All database operations namely insert, update, search and delete can be performed on the database using methods of JdbcTemplate class.

On DIGIT, however, we handle create and update operations asynchronously. Our persister service listens on the topic to which service applications are pushed for insertion and update. The persister then takes care of executing insert and update operations on the database without hogging our application's threads.

That leaves us with execution of search queries on the database to return applications as per the search parameters provided by the user.

For this guide, these are the steps that we will be taking to implement repository layer -

i) Create querybuilder and rowmapper folders within repository folder.

ii) Create a class by the name of BirthApplicationQueryBuilder in querybuilder folder and annotate it with @Component annotation. Put the following content in BirthApplicationQueryBuilder class -

iii) Next, create a class by the name of BirthApplicationRowMapper under rowmapper folder and annotate it with @Component. Add the following content in the class -

iv) Finally, create a class by the name of BirthRegistrationRepository under repository folder and annotate it with @Repository annotation. Put the following content into the class -

Producer:

Producer classes help in pushing data from the application to kafka topics. For this, we have a custom implementation of KafkaTemplate class in our tracer library called CustomKafkaTemplate. This implementation of producer class does not change across services of DIGIT. Producer implementation can be viewed here - Producer Implementation

Now, for adding producer support in this guide, the following steps need to be followed -

i) Update tracer version in pom.xml to 2.0.0-SNAPSHOT

ii) Create a producer folder and add a new class to it by the name of Producer. Add the following code to this class -
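The producer pattern can be sketched as below. In the actual service the template field is the tracer library's CustomKafkaTemplate; a stand-in interface is used here (an assumption, not the tracer API) so the sketch is self-contained:

```java
// Stand-in for the tracer library's CustomKafkaTemplate (illustrative).
interface PushTemplate {
    void send(String topic, Object value);
}

// Sketch of the Producer class: the single funnel through which service
// classes push records onto kafka topics.
class Producer {
    private final PushTemplate template;

    Producer(PushTemplate template) { this.template = template; }

    void push(String topic, Object value) { template.send(topic, value); }
}
```

Keeping one thin Producer per service means the kafka client wiring can change (or be faked in tests) without touching business code.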

Consumers:

Customized SMS creation: Once an application is created/updated, the data is pushed on a kafka topic. We trigger notifications by consuming data from this topic. Whenever any message is consumed, the service will call the localisation service to fetch the SMS template. It will then replace the placeholders in the SMS template with the values in the message it consumed (for example, it will replace the {NAME} placeholder with the owner's name from the data consumed). Once the SMS text is ready, the service will push this data (create the SMSRequest object, which is part of the common modules, and push the object) onto the notification topic. (The SMS service consumes data from the notification topic and triggers the SMS.)
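The placeholder substitution described above can be sketched as follows; the class name, method name and placeholder keys are illustrative, not the service's actual code:

```java
import java.util.Map;

// Sketch of SMS template resolution: localisation returns a template such as
// "Dear {NAME}, ..." and each {KEY} placeholder is replaced with a value taken
// from the consumed kafka message.
class SmsTemplateResolver {
    static String resolve(String template, Map<String, String> values) {
        String message = template;
        for (Map.Entry<String, String> entry : values.entrySet()) {
            message = message.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return message;
    }
}
```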

For our guide, we will be implementing a notification consumer in the following section.

Notification:

Once an application is created/requested or progresses further in the workflow, notifications can be triggered, as each of these events is pushed onto kafka topics which can be listened on, and an SMS/email/in-app notification can be sent to the concerned user(s).

For our guide, we will be implementing a notification consumer which will listen on the topic to which birth registration applications are pushed on creation, create a customized message, and send it to the notification service (SMS/email) to be delivered to the concerned users.

Create a POJO by the name of SMSRequest under models folder and add the following content into it -

Next, to handle preparation of customized message and pushing the notification we will create a class by the name of NotificationService under service folder. Annotate it with @Service annotation and add the following content to it -

Payment Backupdate:

Once payment is done, the application status has to be updated. Since we have a microservice architecture, the two services can communicate with each other either through API calls or using message queues (kafka in our case). To avoid any service-specific code in the collection service, we use the second approach to notify the service of payment for its application. Whenever a payment is done, the collection service will publish the payment details on a kafka topic. Any microservice which wants to get notified when payments are done can subscribe to this topic. Once the service consumes the payment message, it will check whether the payment was done for its service by checking the businessService code. If it was done for the given service, it will update the application status to PAID or trigger the workflow action PAY, depending on the use case.

For our guide, we will follow the following steps to create payment backupdate consumer -

i) Create a consumer class by the name of PaymentBackUpdateConsumer. Annotate it with @Component annotation and add the following content to it -

ii) Next, under service folder create a new class by the name of PaymentUpdateService and annotate it with @Service. Put the following content in this class -

iii) Create the following POJOs under models folder -

Persister configurations:

The persister configuration is written in YAML format. The INSERT and UPDATE queries for each table are added in prepared statement format, followed by the jsonPaths of the values which have to be inserted/updated.

For example, for a table named studentinfo with id, name, age, marks fields, the following configuration will get the persister ready to insert data into studentinfo table -
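A hedged sketch of such a persister config for the studentinfo example is shown below. The topic name, service name and jsonPaths are illustrative assumptions and must match what the service actually pushes onto kafka:

```yaml
serviceMaps:
  serviceName: student-service
  mappings:
    - version: 1.0
      description: Persists student details in the studentinfo table
      fromTopic: save-student-info
      isTransaction: true
      queryMaps:
        - query: INSERT INTO studentinfo(id, name, age, marks) VALUES (?, ?, ?, ?);
          basePath: Student
          jsonMaps:
            - jsonPath: $.Student.id
            - jsonPath: $.Student.name
            - jsonPath: $.Student.age
            - jsonPath: $.Student.marks
```

Each `?` in the prepared statement is filled, in order, from the corresponding jsonPath in the consumed message.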

 

For our guide, for adding persister config, the following steps need to be followed -

i) Clone configs repo locally.

ii) Create a file by the name of digit-developer-guide.yml under persister configs folder.

iii) Add the following content into it -

Indexer Configuration:

The indexer is designed to perform all the indexing tasks of the DIGIT platform. The service reads records posted on specific kafka topics and picks the corresponding index configuration from the yaml file provided by the respective module configuration. Configurations are yaml-based. A detailed guide to creating indexer configs is available in the following document - Indexer Configuration Guide.

For our guide, we will create a new file under egov-indexer in configs repo by the name of digit-developer-guide.yml and put the following content into it -

 

Certificate Generation:

The final step in this process is the creation of configs to create a birth registration PDF for the citizens to download. For this, we will make use of DIGIT’s PDF service which uses PDFMake and Mustache libraries to generate PDF. A detailed documentation on PDF service and generating PDFs using PDF service can be found here - PDF Generation Service.

For our guide, we will follow the following steps to set up PDF service locally and generate PDF for our birth registration service -

i) Clone DIGIT Services repo.

ii) Clone Configs repo.

iii) Now, go into the DIGIT-Dev repo and open up a terminal. Checkout DIGIT_DEVELOPER_GUIDE branch.

iv) Now, go inside the configs folder and, under the pdf-service data config and format config folders, create a file by the name of digit-developer-guide.json

Add the following content in this newly created data config file -

v) Now, under format-config folder, again create a file by the name of digit-developer-guide.json and put the following content into it -

vi) Now, open PDF service (under core-services repository of DIGIT-Dev) on your IDE. Open Environment.js file and change the following properties to point to the local config files that have been created. For example, in my local setup I have pointed these to the local files that I created -

vii) Now, make sure that kafka and workflow services are running locally and port-forward the following services -

egov-user to port 8284

egov-localization to port 8286

egov-filestore to 8288

egov-mdms to 8082

viii) PDF service is now ready to be started up. Execute the following commands to start it up -

ix) Once PDF service is up hit the following CURL to look at the created PDF -

 

Congratulations on making it through this guide. Once all the dependent services are configured/integrated, it is time to test our completed application!

To run and test our sample application, the following steps need to be followed -

  1. Run kafka, workflow and pdf services locally, along with the code of the DIGIT_DEVELOPER_GUIDE branch (for consistency).

  2. Port-forward the following services -

a. egov-user to port 8284

b. egov-localization to port 8286

c. egov-filestore to 8288

d. egov-mdms to 8082

  3. Run the birth-registration-service that we just created.

  4. Import the Postman collection of the APIs that this sample service exposes from here - Voter Registration Postman Collection

  5. Hit the _create API request to create a birth registration application.

  6. Hit the _search API request to search for the created birth registration applications.

  7. Hit the _update API request to update your application or transition it further in the workflow by changing the actions.