Airflow remote logging using AWS S3

Airflow logs are stored on the local filesystem by default, in the $AIRFLOW_HOME/logs directory. Airflow can also be easily configured to store its logs on AWS S3; this is called remote logging. This blog entry describes the steps required to configure Airflow to store its logs in an S3 bucket. The blog entry is divided into the following sections: Introduction, Create S3 Connection, Configure airflow.cfg, and Demo. Introduction For this blog entry, we are running Airflow on an Ubuntu server which has access to AWS S3 buckets via the AWS CLI. Note: if you are using an EC2 instance, please make sure that your instance has read-write access to S3 buckets. Create S3 Connection To enable remote logging in Airflow, we need to make use of an Airflow plugin which can be installed as part of the airflow pip install command. Please refer to this blog entry for more details. Go to … Read more
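
To give a flavour of what the post configures, here is a minimal sketch of the relevant airflow.cfg settings for an Airflow 1.10.x install; the bucket path and connection id are placeholders, not values taken from the post.

```ini
# airflow.cfg, [core] section (Airflow 1.10.x) – write task logs to S3.
# my-airflow-logs and my_s3_conn are placeholder names.
remote_logging = True
remote_base_log_folder = s3://my-airflow-logs/logs
remote_log_conn_id = my_s3_conn
encrypt_s3_logs = False
```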

Airflow RBAC – Role-Based Access Control

Airflow version 1.10.0 onward introduced Role-Based Access Control (RBAC) as part of its security landscape. RBAC is the quickest way to secure Airflow. It comes with pre-built roles, which makes it easy to implement, and, not stopping there, you can add your own roles as well. In addition to securing various features of the Airflow web UI, RBAC can be used to secure access to DAGs as well, though this feature is only available from 1.10.2 onward. As with any software, a few bugs popped up, but these were fixed in 1.10.7; there is an entry for the bug, AIRFLOW-2694. In this blog entry, we will touch upon DAG-level access as well. The blog entry is divided into a few parts: Enable RBAC, Create users using standard roles, Roles & permissions, and Secure DAGs with RBAC. Needless to say, I am assuming you have a working Airflow installation. If not, please head … Read more
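
For a quick taste of the steps involved, here is a small sketch using the Airflow 1.10.x CLI; it assumes rbac = True has already been set under [webserver] in airflow.cfg, and the user details shown are placeholders.

```bash
# Create a user with one of the pre-built RBAC roles (Airflow 1.10.x CLI).
# Assumes rbac = True is set under [webserver] in airflow.cfg; values are placeholders.
airflow create_user --role Viewer --username jane --firstname Jane --lastname Doe \
    --email jane@example.com --password changeme
```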

Airflow – XCOM

Introduction Airflow XCom is used for inter-task communication. It sounds a bit complex, but it is really very simple: its implementation inside Airflow is straightforward, it is easy to use, and it has numerous use cases. Inter-task communication is achieved by passing key-value pairs between tasks. Tasks can run on any Airflow worker and need not run on the same worker. To pass information, a task pushes a key-value pair; the key-value pair is then pulled by another task and used. This blog entry requires some knowledge of Airflow. If you are just starting out, I would suggest you first get familiar with Airflow; you can try this link. This blog entry is divided into: Pushing values to XCOM, Viewing XCOM values in the Airflow UI, and Pulling XCOM values. Pushing values to XCOM Before we dive headlong into XCOM, let’s see where to … Read more
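
A minimal sketch of the push/pull pattern is shown below; the DAG id, task ids, key and value are illustrative, not taken from the post.

```python
# Minimal XCom push/pull sketch (Airflow 1.10.x imports); names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def push_value(**context):
    # Push a key-value pair to XCom from this task
    context['ti'].xcom_push(key='record_count', value=42)


def pull_value(**context):
    # Pull the value pushed by the upstream task
    count = context['ti'].xcom_pull(task_ids='push_task', key='record_count')
    print("Pulled value: %s" % count)


with DAG('xcom_demo', start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
    push_task = PythonOperator(task_id='push_task', python_callable=push_value,
                               provide_context=True)
    pull_task = PythonOperator(task_id='pull_task', python_callable=pull_value,
                               provide_context=True)
    push_task >> pull_task
```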

Airflow – Scale-out with Redis and Celery

Introduction This post uses Redis and Celery to scale out Airflow. Redis is a simple in-memory data store, commonly used as a caching server, and it scales out quite well; it can be made resilient by deploying it as a cluster. In my previous post, the Airflow scale-out was done using Celery with RabbitMQ as the message broker. On the whole, I found the idea of maintaining RabbitMQ a bit fiddly unless you happen to be a RabbitMQ expert. Redis seems to be a better solution when compared to RabbitMQ: it is a lot easier to deploy and maintain than the various steps taken to deploy a RabbitMQ broker. In a nutshell, I like it more than RabbitMQ! To create an infrastructure like this we need to do the following steps: Install & configure a Redis server on a separate host – 1 server; Install & configure Airflow with Redis and the Celery Executor … Read more
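
For context, here is a minimal sketch of the airflow.cfg changes such a setup typically involves (Airflow 1.10.x); the host names and credentials are placeholders.

```ini
# airflow.cfg sketch for CeleryExecutor with Redis as the broker (Airflow 1.10.x).
# Host names, ports and credentials below are placeholders.
[core]
executor = CeleryExecutor

[celery]
broker_url = redis://redis-host:6379/0
result_backend = db+postgresql://airflow:airflow@db-host:5432/airflow
```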

Airflow – Scale out with RabbitMQ and Celery

Introduction Airflow queues and workers are required if there is a need to make the Airflow infrastructure more flexible and resilient. This blog entry describes how to install, set up and configure the additional components so that we can use Airflow in a more flexible and resilient manner. Till now we have used a local executor to execute all our jobs, which works fine if we have a small number of jobs and they do not run while another job is running. However, in the real world, this is rarely the case. Additionally, we also need to take care of the possibility of the Airflow local executor becoming unavailable. Airflow queues are like any other queues and use a messaging system, such as RabbitMQ or ActiveMQ. The Airflow scheduler sends tasks as messages to the queues and hence acts as a publisher. Airflow workers are configured to listen for events (i.e. tasks) coming in on particular queues and execute … Read more
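
To illustrate the publisher/worker idea, below is a small sketch of routing a task to a named queue; the DAG, task and queue names are assumptions, and the worker command in the comment is the Airflow 1.10.x CLI.

```python
# Sketch: route a task to a named queue so only workers subscribed to that
# queue pick it up. A matching worker would be started with something like:
#   airflow worker -q reporting_queue
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG('queue_demo', start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
    report_task = BashOperator(
        task_id='report_task',
        bash_command='echo "running on the reporting worker"',
        queue='reporting_queue',  # the scheduler publishes this task to the named queue
    )
```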

Airflow – Connections

Introduction We understand that Airflow is a workflow/job orchestration engine and can execute various tasks by connecting to our environments. There are various ways to connect to an environment, but one needs the connection details of that environment in order to connect to it. For example: Postgres DB – hostname, port, schema; SSH – a hostname which allows SSH connections. The list may extend to AWS, Google Cloud or Azure as well, but all of them need some sort of connection information. Airflow is no different and needs connection information. One only needs to log in and go to Admin -> Connections to see the exhaustive list of connections which are possible. Airflow allows for various connections, and some of the documentation can be found on this link. You can see the various types of connections which can be made by Airflow; this makes connecting to different types of technologies easy. Keep in mind … Read more
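
As a small illustration of how a stored connection is consumed in code, the sketch below uses a PostgresHook with an assumed conn_id of my_postgres (created beforehand via Admin -> Connections); it also assumes the Postgres extras are installed.

```python
# Sketch: use a connection stored in Airflow's metadata via its conn_id.
# 'my_postgres' is an assumed connection created under Admin -> Connections.
from airflow.hooks.postgres_hook import PostgresHook


def fetch_rows():
    hook = PostgresHook(postgres_conn_id='my_postgres')
    # The hook reads host, port, schema and credentials from the stored connection
    return hook.get_records("SELECT 1")
```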

Airflow – Sub-DAGs

Introduction Workflows can quickly become quite complicated, which makes them difficult to understand. To reduce complexity, tasks can be grouped together into smaller DAGs which are then called from a parent DAG. This makes DAGs easier to manage and maintain. Creating a sub-DAG consists of: creating the actual sub-DAG and testing it, and then creating a DAG which calls the sub-DAG created in step 1. For those of you wondering how to call a sub-DAG, it is easy using the SubDagOperator. In the next example, we will re-use one of our earlier examples as a sub-DAG and call it from another DAG. Create a Sub-DAG Let’s first see the code of the sub-DAG. You will observe that it is very similar to a DAG from the previous entries, except that it is created inside a function. See below. Before moving forward, a few important points: the Airflow sub-DAG is in a separate file in the same … Read more
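
To give a feel for the pattern, here is a minimal sketch of a sub-DAG factory function and the SubDagOperator that calls it (Airflow 1.10.x imports); the DAG and task names are illustrative, not the ones used in the post.

```python
# Sketch: a sub-DAG built inside a function and called via SubDagOperator.
# DAG and task names are illustrative (Airflow 1.10.x imports).
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.subdag_operator import SubDagOperator

default_args = {'start_date': datetime(2020, 1, 1)}


def build_subdag(parent_dag_id, child_dag_id, args):
    # The sub-DAG's id must be '<parent_dag_id>.<child_dag_id>'
    subdag = DAG(dag_id='%s.%s' % (parent_dag_id, child_dag_id),
                 default_args=args, schedule_interval=None)
    DummyOperator(task_id='subdag_task', dag=subdag)
    return subdag


with DAG('parent_dag', default_args=default_args, schedule_interval=None) as dag:
    call_subdag = SubDagOperator(
        task_id='my_subdag',
        subdag=build_subdag('parent_dag', 'my_subdag', default_args),
    )
```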

Airflow Branch joins

Introduction In many use cases, there is a requirement to have different branches (see this blog entry) in a workflow. However, these branches can also join together and execute a common task, something similar to the pic below. There is no special operator for this; it comes down to the way we assign & set upstream and downstream relationships between the tasks. An example of this is shown below. Branch Join DAG The hello_joins example extends a DAG from the previous blog entry. An additional task is added to the DAG, which has been quite imaginatively named join_task. Understand DAG Let’s look at the changes introduced in the DAG. Step 5 – A new task called join_task is added. Observe the TriggerRule which has been added: it is set to ONE_SUCCESS, which means that if any one of the preceding tasks has been successful, join_task should be executed. Step 6 – Adds the … Read more
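
A minimal sketch of the join pattern is shown below; the DAG and task names are stand-ins rather than the post's actual hello_joins code.

```python
# Sketch: two branches feeding a common join task with TriggerRule.ONE_SUCCESS.
# DAG and task names are illustrative (Airflow 1.10.x imports).
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.trigger_rule import TriggerRule

with DAG('branch_join_demo', start_date=datetime(2020, 1, 1),
         schedule_interval=None) as dag:
    branch_a = DummyOperator(task_id='branch_a')
    branch_b = DummyOperator(task_id='branch_b')
    # join_task runs as soon as any one upstream task has succeeded
    join_task = DummyOperator(task_id='join_task',
                              trigger_rule=TriggerRule.ONE_SUCCESS)
    branch_a >> join_task
    branch_b >> join_task
```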

Airflow – Variables

In the last entry on Airflow (which was many moons ago!) we created a simple DAG and executed it. It was simple, to get us started, but it did not allow us any flexibility: we could not change the behaviour of the DAG. In real-world scenarios, we may need to change the behaviour of our workflow based on certain parameters. This is accomplished by Airflow Variables. Airflow Variables are simple key-value pairs which are stored in the database that holds the Airflow metadata. These variables can be created & managed via the Airflow UI or the Airflow CLI: Airflow WebUI -> Admin -> Variables. Some of the features of Airflow Variables are below: they can be defined as simple key-value pairs; one variable can hold a list of key-value pairs as well; they are stored in the Airflow metadata database; and they can be used in the Airflow DAG code as … Read more
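
As a quick illustration of reading variables from DAG code, here is a small sketch; the variable names env and job_config are assumptions, not variables used in the post.

```python
# Sketch: read Airflow Variables inside DAG code; variable names are assumptions.
from airflow.models import Variable

# A simple key-value pair, with a default if the variable is not defined
env = Variable.get('env', default_var='dev')

# A variable holding a JSON document (i.e. a set of key-value pairs)
job_config = Variable.get('job_config', deserialize_json=True, default_var={})

print(env, job_config)
```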

Airflow – Branching

Introduction Branching is a useful concept when creating workflows. Simply speaking, it is a way to implement if-then-else logic in Airflow. For example, there may be a requirement to execute certain task(s) only when a particular condition is met. This is achieved via Airflow branching: if the condition is true, certain task(s) are executed, and if the condition is false, different task(s) are executed. Branching is achieved by using an Airflow operator called the BranchPythonOperator. To keep it simple, it is essentially an operator which implements a task. This task then calls a simple method written in Python whose only job is to implement the if-then-else logic and return to Airflow the name of the next task to execute. Branching Let’s take an example. We want to check if the value of a variable is greater than 10. If the value is greater than or equal to 15 … Read more
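
Below is a minimal sketch of the BranchPythonOperator pattern described above; the DAG id, task ids, variable name and threshold are illustrative stand-ins rather than the post's code.

```python
# Sketch: BranchPythonOperator returning the task_id of the branch to run next.
# DAG, task and variable names are illustrative (Airflow 1.10.x imports).
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator


def choose_branch():
    # if-then-else logic: return the name of the next task to execute
    value = int(Variable.get('my_value', default_var='0'))
    return 'big_value_task' if value > 10 else 'small_value_task'


with DAG('branching_demo', start_date=datetime(2020, 1, 1),
         schedule_interval=None) as dag:
    branch = BranchPythonOperator(task_id='branch', python_callable=choose_branch)
    big_value_task = DummyOperator(task_id='big_value_task')
    small_value_task = DummyOperator(task_id='small_value_task')
    branch >> [big_value_task, small_value_task]
```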