
In Airflow, a DAG (Directed Acyclic Graph) is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies.

A DAG is defined in a Python script, which represents the DAG's structure (tasks and their dependencies) as code.

How to run a DAG in Airflow?

Manual Trigger

1. Log in to the Punjab Prod server using the credentials:

URL: https://airflow.mseva-qa.lgpunjab.gov.in/login/?next=http%3A%2F%2Fairflow.mseva-qa.lgpunjab.gov.in%2Fhome

    username: admin
    password: admin


2. Trigger the DAG by clicking the “Trigger DAG with Config” option.

3. Enter the date and click the Trigger button. The config format is {"date": "dd-MM-yyyy"}.
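In Python terms, the dd-MM-yyyy pattern corresponds to the strptime format %d-%m-%Y; a small sketch of parsing a trigger config in the documented shape (the payload value is just an example):

```python
import json
from datetime import datetime

# Example trigger config in the documented {"date": "dd-MM-yyyy"} shape.
conf = json.loads('{"date": "01-04-2023"}')

# dd-MM-yyyy maps to %d-%m-%Y in Python's strptime.
run_date = datetime.strptime(conf["date"], "%d-%m-%Y").date()
print(run_date)  # 2023-04-01
```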

4. The logs can be viewed by expanding the DAG, choosing a stage for any module, and clicking the Log option.

Logs can also be viewed in the Elasticsearch index adaptor_logs:

GET adaptor_logs/_search

A timestamp can be provided in the query to restrict results to the day being searched for.
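A sketch of a search body with such a day filter; the field name "timestamp" and the dd-MM-yyyy date format in the adaptor_logs mapping are assumptions:

```python
import json

# Body for: GET adaptor_logs/_search
# Restricts hits to a single day. The field name "timestamp" and its
# accepted date format are assumptions about the index mapping.
query = {
    "query": {
        "range": {
            "timestamp": {
                "gte": "01-04-2023",
                "lte": "01-04-2023",
                "format": "dd-MM-yyyy",
            }
        }
    }
}
print(json.dumps(query, indent=2))
```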

Scheduled DAG

This DAG triggers at midnight every day and processes data for the previous day.
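In cron terms, "midnight every day" is the expression 0 0 * * *. A sketch of how a run could derive the previous day's date (the helper name is illustrative; the output uses the same dd-MM-yyyy format as the manual trigger):

```python
from datetime import date, timedelta


def previous_day(run_day: date) -> str:
    # A midnight run on run_day processes data for the day before,
    # formatted like the manual-trigger config ("dd-MM-yyyy").
    return (run_day - timedelta(days=1)).strftime("%d-%m-%Y")


print(previous_day(date(2023, 4, 2)))  # 01-04-2023
```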

Configure the Airflow variables:

Key            Value
password       eGov@123
username       SYSTEMSU3
token          ZWdvdi11c2VyLWNsaWVudDo=
tenantid       pg
usertype       SYSTEM
totalulb_url   https://raw.githubusercontent.com/egovernments/punjab-mdms-data/master/data/pb/tenant/tenants.json
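As an aside, the token value is a base64 string; decoding it (a sketch, not a step from this page) shows it is the eGov OAuth client id/secret pair that goes into a Basic Authorization header:

```python
import base64

token = "ZWdvdi11c2VyLWNsaWVudDo="
decoded = base64.b64decode(token).decode()
print(decoded)  # egov-user-client:
```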

Configure the connections:

ConnectionId   Connection Type   Host                               Port   Schema   Remark
es_conn        ElasticSearch     elasticsearch-data-v1.es-cluster   9200            For the ES server
digit-auth     HTTP              http://staging.digit.org                  https    For the auth API connection
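A sketch of how the variables above and the digit-auth connection might combine into a token request to the eGov user service; the endpoint path /user/oauth/token, the form-field names, and the grant type are assumptions not stated on this page, and nothing is actually sent:

```python
# digit-auth connection host (from the table above).
host = "http://staging.digit.org"
# Assumed eGov token endpoint; not taken from this page.
url = f"{host}/user/oauth/token"

headers = {
    # The 'token' Airflow variable is the Basic-auth client credential.
    "Authorization": "Basic ZWdvdi11c2VyLWNsaWVudDo=",
    "Content-Type": "application/x-www-form-urlencoded",
}
payload = {
    "username": "SYSTEMSU3",   # 'username' variable
    "password": "eGov@123",    # 'password' variable
    "tenantId": "pg",          # 'tenantid' variable
    "userType": "SYSTEM",      # 'usertype' variable
    "grant_type": "password",  # assumed grant type
}
print(url)
```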
