Airflow can be set up behind a reverse proxy, with the ability to set its endpoint with great flexibility.
For example, you can configure your reverse proxy to get:
https://lab.mycompany.com/myorg/airflow/
To do so, you need to set the following setting in your airflow.cfg:
base_url = http://my_host/myorg/airflow
Additionally, if you use the Celery Executor, you can get Flower at /myorg/flower with:
flower_url_prefix = /myorg/flower
Your reverse proxy (e.g. nginx) should be configured as follows:
- pass the URL and HTTP headers as is to the Airflow webserver, without any rewrite, for example:
server {
    listen 80;
    server_name lab.mycompany.com;

    location /myorg/airflow/ {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_redirect off;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
- rewrite the URL for the Flower endpoint:
server {
    listen 80;
    server_name lab.mycompany.com;

    location /myorg/flower/ {
        rewrite ^/myorg/flower/(.*)$ /$1 break;  # remove prefix from http header
        proxy_pass http://localhost:5555;
        proxy_set_header Host $host;
        proxy_redirect off;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
To ensure that Airflow generates URLs with the correct scheme when running behind a TLS-terminating proxy, you should configure the proxy to set the X-Forwarded-Proto header, and enable the ProxyFix middleware in your airflow.cfg:
enable_proxy_fix = True
Note: you should only enable the ProxyFix middleware when running Airflow behind a trusted proxy (AWS ELB, nginx, etc.).
Airflow has limited support for Microsoft Azure: interfaces exist only for Azure Blob Storage and Azure Data Lake. The Hook, Sensors and Operator for Blob Storage and the Azure Data Lake Hook are in the contrib section.
All classes communicate via the Windows Azure Storage Blob protocol. Make sure that an Airflow connection of type wasb exists. Authorization can be done by supplying a login (=Storage account name) and password (=KEY), or a login and SAS token in the extra field (see connection wasb_default for an example).
- :ref:`WasbBlobSensor`: Checks if a blob is present on Azure Blob storage.
- :ref:`WasbPrefixSensor`: Checks if blobs matching a prefix are present on Azure Blob storage.
- :ref:`FileToWasbOperator`: Uploads a local file to a container as a blob.
- :ref:`WasbHook`: Interface with Azure Blob Storage.
.. autoclass:: airflow.contrib.sensors.wasb_sensor.WasbBlobSensor
.. autoclass:: airflow.contrib.sensors.wasb_sensor.WasbPrefixSensor
.. autoclass:: airflow.contrib.operators.file_to_wasb.FileToWasbOperator
.. autoclass:: airflow.contrib.hooks.wasb_hook.WasbHook
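A minimal sketch of how the Blob Storage operator and sensor can be wired together in a DAG, assuming the ``wasb_default`` connection described above; the container, blob and local file names are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_wasb import FileToWasbOperator
from airflow.contrib.sensors.wasb_sensor import WasbBlobSensor

dag = DAG('azure_blob_example', start_date=datetime(2018, 1, 1), schedule_interval='@daily')

# Upload a local file to the 'reports' container (names are illustrative).
upload = FileToWasbOperator(
    task_id='upload_report',
    file_path='/tmp/report.csv',
    container_name='reports',
    blob_name='report.csv',
    wasb_conn_id='wasb_default',
    dag=dag)

# Wait until the blob is visible in Azure Blob Storage.
wait = WasbBlobSensor(
    task_id='wait_for_report',
    container_name='reports',
    blob_name='report.csv',
    wasb_conn_id='wasb_default',
    dag=dag)

upload >> wait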
Cloud variant of an SMB file share. Make sure that an Airflow connection of type wasb exists. Authorization can be done by supplying a login (=Storage account name) and password (=Storage account key), or a login and SAS token in the extra field (see connection wasb_default for an example).
.. autoclass:: airflow.contrib.hooks.azure_fileshare_hook.AzureFileShareHook
Airflow can be configured to read and write task logs in Azure Blob Storage. See :ref:`write-logs-azure`.
AzureDataLakeHook communicates via a REST API compatible with WebHDFS. Make sure that an Airflow connection of type azure_data_lake exists. Authorization can be done by supplying a login (=Client ID), password (=Client Secret), and extra fields tenant (Tenant) and account_name (Account Name) (see connection azure_data_lake_default for an example).
- :ref:`AzureDataLakeHook`: Interface with Azure Data Lake.
.. autoclass:: airflow.contrib.hooks.azure_data_lake_hook.AzureDataLakeHook
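A minimal sketch of using the hook directly, assuming the ``azure_data_lake_default`` connection described above; the local and remote paths are illustrative only:

from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook

hook = AzureDataLakeHook(azure_data_lake_conn_id='azure_data_lake_default')

# Upload a local file to the Data Lake store and verify it arrived.
hook.upload_file(local_path='/tmp/events.json',
                 remote_path='raw/events/events.json')
assert hook.check_for_file('raw/events/events.json')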
Airflow has extensive support for Amazon Web Services, but note that the Hooks, Sensors and Operators are in the contrib section.
- :ref:`EmrAddStepsOperator` : Adds steps to an existing EMR JobFlow.
- :ref:`EmrCreateJobFlowOperator` : Creates an EMR JobFlow, reading the config from the EMR connection.
- :ref:`EmrTerminateJobFlowOperator` : Terminates an EMR JobFlow.
- :ref:`EmrHook` : Interact with AWS EMR.
.. autoclass:: airflow.contrib.operators.emr_add_steps_operator.EmrAddStepsOperator
.. autoclass:: airflow.contrib.operators.emr_create_job_flow_operator.EmrCreateJobFlowOperator
.. autoclass:: airflow.contrib.operators.emr_terminate_job_flow_operator.EmrTerminateJobFlowOperator
.. autoclass:: airflow.contrib.hooks.emr_hook.EmrHook
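A minimal sketch of a create / add steps / terminate pipeline, assuming ``aws_default`` and ``emr_default`` connections; the cluster name and step definition are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.emr_create_job_flow_operator import EmrCreateJobFlowOperator
from airflow.contrib.operators.emr_add_steps_operator import EmrAddStepsOperator
from airflow.contrib.operators.emr_terminate_job_flow_operator import EmrTerminateJobFlowOperator

dag = DAG('emr_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Create the job flow; the cluster config comes from the 'emr_default' connection,
# overridden by job_flow_overrides (values here are illustrative).
create_job_flow = EmrCreateJobFlowOperator(
    task_id='create_job_flow',
    aws_conn_id='aws_default',
    emr_conn_id='emr_default',
    job_flow_overrides={'Name': 'example-cluster'},
    dag=dag)

# Add a step to the job flow created above, pulling the job flow id from XCom.
add_steps = EmrAddStepsOperator(
    task_id='add_steps',
    job_flow_id="{{ task_instance.xcom_pull('create_job_flow', key='return_value') }}",
    aws_conn_id='aws_default',
    steps=[{
        'Name': 'example-step',
        'ActionOnFailure': 'CONTINUE',
        'HadoopJarStep': {'Jar': 'command-runner.jar', 'Args': ['echo', 'hello']},
    }],
    dag=dag)

# Terminate the job flow when the steps are done.
terminate = EmrTerminateJobFlowOperator(
    task_id='terminate_job_flow',
    job_flow_id="{{ task_instance.xcom_pull('create_job_flow', key='return_value') }}",
    aws_conn_id='aws_default',
    dag=dag)

create_job_flow >> add_steps >> terminate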
- :ref:`S3Hook` : Interact with AWS S3.
- :ref:`S3FileTransformOperator` : Copies data from a source S3 location to a temporary location on the local filesystem.
- :ref:`S3ListOperator` : Lists the files matching a key prefix from a S3 location.
- :ref:`S3ToGoogleCloudStorageOperator` : Syncs an S3 location with a Google Cloud Storage bucket.
- :ref:`S3ToHiveTransfer` : Moves data from S3 to Hive. The operator downloads a file from S3 and stores it locally before loading it into a Hive table.
.. autoclass:: airflow.hooks.S3_hook.S3Hook
.. autoclass:: airflow.operators.s3_file_transform_operator.S3FileTransformOperator
.. autoclass:: airflow.contrib.operators.s3_list_operator.S3ListOperator
.. autoclass:: airflow.contrib.operators.s3_to_gcs_operator.S3ToGoogleCloudStorageOperator
.. autoclass:: airflow.operators.s3_to_hive_operator.S3ToHiveTransfer
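A minimal sketch of using S3Hook directly, assuming an ``aws_default`` connection; the bucket and key names are illustrative only:

from airflow.hooks.S3_hook import S3Hook

hook = S3Hook(aws_conn_id='aws_default')

# Upload a local file, check it exists, then read it back.
hook.load_file(filename='/tmp/data.csv',
               key='incoming/data.csv',
               bucket_name='my-bucket',
               replace=True)

if hook.check_for_key('incoming/data.csv', bucket_name='my-bucket'):
    contents = hook.read_key('incoming/data.csv', bucket_name='my-bucket')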
- :ref:`ECSOperator` : Execute a task on AWS EC2 Container Service.
.. autoclass:: airflow.contrib.operators.ecs_operator.ECSOperator
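A minimal sketch of running a registered task definition, assuming an ``aws_default`` connection; the cluster, task definition, container name and command override are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.ecs_operator import ECSOperator

dag = DAG('ecs_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Run a task from an existing task definition on an existing ECS cluster.
run_task = ECSOperator(
    task_id='run_ecs_task',
    task_definition='my-task-definition',
    cluster='my-ecs-cluster',
    overrides={
        'containerOverrides': [
            {'name': 'my-container', 'command': ['echo', 'hello']},
        ],
    },
    aws_conn_id='aws_default',
    region_name='us-east-1',
    dag=dag)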
- :ref:`AWSBatchOperator` : Execute a task on AWS Batch Service.
.. autoclass:: airflow.contrib.operators.awsbatch_operator.AWSBatchOperator
- :ref:`AwsRedshiftClusterSensor` : Waits for a Redshift cluster to reach a specific status.
- :ref:`RedshiftHook` : Interact with AWS Redshift, using the boto3 library.
- :ref:`RedshiftToS3Transfer` : Executes an unload command to S3 as CSV with or without headers.
- :ref:`S3ToRedshiftTransfer` : Executes a copy command from S3 as CSV with or without headers.
.. autoclass:: airflow.contrib.sensors.aws_redshift_cluster_sensor.AwsRedshiftClusterSensor
.. autoclass:: airflow.contrib.hooks.redshift_hook.RedshiftHook
.. autoclass:: airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer
.. autoclass:: airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer
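A minimal sketch of moving data between S3 and Redshift, assuming ``redshift_default`` and ``aws_default`` connections; the schema, table, bucket and key names are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer

dag = DAG('redshift_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Copy a CSV file from S3 into a Redshift table.
load = S3ToRedshiftTransfer(
    task_id='s3_to_redshift',
    schema='public',
    table='events',
    s3_bucket='my-bucket',
    s3_key='incoming',
    redshift_conn_id='redshift_default',
    aws_conn_id='aws_default',
    copy_options=['CSV'],
    dag=dag)

# Unload the same table back to S3 as CSV.
unload = RedshiftToS3Transfer(
    task_id='redshift_to_s3',
    schema='public',
    table='events',
    s3_bucket='my-bucket',
    s3_key='exports',
    redshift_conn_id='redshift_default',
    aws_conn_id='aws_default',
    dag=dag)

load >> unload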
Databricks has contributed an Airflow operator which enables submitting runs to the Databricks platform. Internally, the operator talks to the ``api/2.0/jobs/runs/submit`` endpoint.
.. autoclass:: airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator
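A minimal sketch of submitting a notebook run on a new cluster, assuming a ``databricks_default`` connection; the cluster spec and notebook path are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.databricks_operator import DatabricksSubmitRunOperator

dag = DAG('databricks_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Submit a one-time run of a notebook on a new cluster.
notebook_run = DatabricksSubmitRunOperator(
    task_id='notebook_run',
    databricks_conn_id='databricks_default',
    new_cluster={
        'spark_version': '2.1.0-db3-scala2.11',
        'node_type_id': 'r3.xlarge',
        'num_workers': 2,
    },
    notebook_task={'notebook_path': '/Users/someone@example.com/MyNotebook'},
    dag=dag)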
Airflow has extensive support for the Google Cloud Platform, but note that most Hooks and Operators are in the contrib section, meaning they have beta status and can have breaking changes between minor releases.
See the :ref:`GCP connection type <connection-type-GCP>` documentation to configure connections to GCP.
Airflow can be configured to read and write task logs in Google Cloud Storage. See :ref:`write-logs-gcp`.
- :ref:`BigQueryCheckOperator` : Performs checks against a SQL query that will return a single row with different values.
- :ref:`BigQueryValueCheckOperator` : Performs a simple value check using SQL code.
- :ref:`BigQueryIntervalCheckOperator` : Checks that the values of metrics given as SQL expressions are within a certain tolerance of the ones from days_back before.
- :ref:`BigQueryCreateEmptyTableOperator` : Creates a new, empty table in the specified BigQuery dataset optionally with schema.
- :ref:`BigQueryCreateExternalTableOperator` : Creates a new, external table in the dataset with the data in Google Cloud Storage.
- :ref:`BigQueryDeleteDatasetOperator` : Deletes an existing BigQuery dataset.
- :ref:`BigQueryCreateEmptyDatasetOperator` : Creates an empty BigQuery dataset.
- :ref:`BigQueryOperator` : Executes BigQuery SQL queries in a specific BigQuery database.
- :ref:`BigQueryToBigQueryOperator` : Copy a BigQuery table to another BigQuery table.
- :ref:`BigQueryToCloudStorageOperator` : Transfers a BigQuery table to a Google Cloud Storage bucket.
.. autoclass:: airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator
.. autoclass:: airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator
.. autoclass:: airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator
.. autoclass:: airflow.contrib.operators.bigquery_get_data.BigQueryGetDataOperator
.. autoclass:: airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator
.. autoclass:: airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator
.. autoclass:: airflow.contrib.operators.bigquery_operator.BigQueryDeleteDatasetOperator
.. autoclass:: airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyDatasetOperator
.. autoclass:: airflow.contrib.operators.bigquery_operator.BigQueryOperator
.. autoclass:: airflow.contrib.operators.bigquery_table_delete_operator.BigQueryTableDeleteOperator
.. autoclass:: airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator
.. autoclass:: airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator
.. autoclass:: airflow.contrib.hooks.bigquery_hook.BigQueryHook
    :members:
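A minimal sketch of running a query and materializing the result, assuming a ``bigquery_default`` connection; the project, dataset and table names are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

dag = DAG('bigquery_example', start_date=datetime(2018, 1, 1), schedule_interval='@daily')

# Run a standard SQL query and write the result to a destination table.
aggregate = BigQueryOperator(
    task_id='aggregate_events',
    sql='SELECT user_id, COUNT(*) AS events FROM `my-project.analytics.events` GROUP BY user_id',
    destination_dataset_table='my-project.analytics.daily_event_counts',
    write_disposition='WRITE_TRUNCATE',
    use_legacy_sql=False,
    bigquery_conn_id='bigquery_default',
    dag=dag)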
- :ref:`GcfFunctionDeployOperator` : Deploys a Google Cloud Function to the cloud.
- :ref:`GcfFunctionDeleteOperator` : Deletes a Google Cloud Function in the cloud.
.. autoclass:: airflow.contrib.operators.gcp_function_operator.GcfFunctionDeployOperator
.. autoclass:: airflow.contrib.operators.gcp_function_operator.GcfFunctionDeleteOperator
.. autoclass:: airflow.contrib.hooks.gcp_function_hook.GcfHook
    :members:
- :ref:`DataFlowJavaOperator` : launching Cloud Dataflow jobs written in Java.
- :ref:`DataflowTemplateOperator` : launching a templated Cloud DataFlow batch job.
- :ref:`DataFlowPythonOperator` : launching Cloud Dataflow jobs written in Python.
.. autoclass:: airflow.contrib.operators.dataflow_operator.DataFlowJavaOperator
from datetime import datetime, timedelta

from airflow import DAG
from airflow.contrib.operators.dataflow_operator import DataFlowJavaOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2016, 8, 1),
    'email': ['alex@vanboxel.be'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=30),
    'dataflow_default_options': {
        'project': 'my-gcp-project',
        'zone': 'us-central1-f',
        'stagingLocation': 'gs://bucket/tmp/dataflow/staging/',
    }
}

dag = DAG('test-dag', default_args=default_args)

task = DataFlowJavaOperator(
    gcp_conn_id='gcp_default',
    task_id='normalize-cal',
    jar='{{var.value.gcp_dataflow_base}}pipeline-ingress-cal-normalize-1.0.jar',
    options={
        'autoscalingAlgorithm': 'BASIC',
        'maxNumWorkers': '50',
        'start': '{{ds}}',
        'partitionType': 'DAY'
    },
    dag=dag)
.. autoclass:: airflow.contrib.operators.dataflow_operator.DataflowTemplateOperator
.. autoclass:: airflow.contrib.operators.dataflow_operator.DataFlowPythonOperator
.. autoclass:: airflow.contrib.hooks.gcp_dataflow_hook.DataFlowHook
    :members:
- :ref:`DataprocClusterCreateOperator` : Create a new cluster on Google Cloud Dataproc.
- :ref:`DataprocClusterDeleteOperator` : Delete a cluster on Google Cloud Dataproc.
- :ref:`DataprocClusterScaleOperator` : Scale up or down a cluster on Google Cloud Dataproc.
- :ref:`DataProcPigOperator` : Start a Pig query Job on a Cloud DataProc cluster.
- :ref:`DataProcHiveOperator` : Start a Hive query Job on a Cloud DataProc cluster.
- :ref:`DataProcSparkSqlOperator` : Start a Spark SQL query Job on a Cloud DataProc cluster.
- :ref:`DataProcSparkOperator` : Start a Spark Job on a Cloud DataProc cluster.
- :ref:`DataProcHadoopOperator` : Start a Hadoop Job on a Cloud DataProc cluster.
- :ref:`DataProcPySparkOperator` : Start a PySpark Job on a Cloud DataProc cluster.
- :ref:`DataprocWorkflowTemplateInstantiateOperator` : Instantiate a WorkflowTemplate on Google Cloud Dataproc.
- :ref:`DataprocWorkflowTemplateInstantiateInlineOperator` : Instantiate a WorkflowTemplate Inline on Google Cloud Dataproc.
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataprocClusterCreateOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataprocClusterScaleOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataprocClusterDeleteOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcPigOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcHiveOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcSparkSqlOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcSparkOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcHadoopOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataProcPySparkOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataprocWorkflowTemplateInstantiateOperator
.. autoclass:: airflow.contrib.operators.dataproc_operator.DataprocWorkflowTemplateInstantiateInlineOperator
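A minimal sketch of an ephemeral-cluster workflow: create a cluster, run a PySpark job, then delete the cluster. The project, zone, cluster name and job file are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.dataproc_operator import (
    DataprocClusterCreateOperator,
    DataProcPySparkOperator,
    DataprocClusterDeleteOperator,
)

dag = DAG('dataproc_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Spin up a small cluster.
create_cluster = DataprocClusterCreateOperator(
    task_id='create_cluster',
    cluster_name='example-cluster',
    project_id='my-gcp-project',
    num_workers=2,
    zone='us-central1-f',
    dag=dag)

# Run a PySpark job stored in Cloud Storage on that cluster.
pyspark_job = DataProcPySparkOperator(
    task_id='pyspark_job',
    main='gs://my-bucket/jobs/wordcount.py',
    cluster_name='example-cluster',
    dag=dag)

# Tear the cluster down afterwards.
delete_cluster = DataprocClusterDeleteOperator(
    task_id='delete_cluster',
    cluster_name='example-cluster',
    project_id='my-gcp-project',
    dag=dag)

create_cluster >> pyspark_job >> delete_cluster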
- :ref:`DatastoreExportOperator` : Export entities from Google Cloud Datastore to Cloud Storage.
- :ref:`DatastoreImportOperator` : Import entities from Cloud Storage to Google Cloud Datastore.
.. autoclass:: airflow.contrib.operators.datastore_export_operator.DatastoreExportOperator
.. autoclass:: airflow.contrib.operators.datastore_import_operator.DatastoreImportOperator
.. autoclass:: airflow.contrib.hooks.datastore_hook.DatastoreHook
    :members:
- :ref:`MLEngineBatchPredictionOperator` : Start a Cloud ML Engine batch prediction job.
- :ref:`MLEngineModelOperator` : Manages a Cloud ML Engine model.
- :ref:`MLEngineTrainingOperator` : Start a Cloud ML Engine training job.
- :ref:`MLEngineVersionOperator` : Manages a Cloud ML Engine model version.
.. autoclass:: airflow.contrib.operators.mlengine_operator.MLEngineBatchPredictionOperator
    :members:
.. autoclass:: airflow.contrib.operators.mlengine_operator.MLEngineModelOperator
    :members:
.. autoclass:: airflow.contrib.operators.mlengine_operator.MLEngineTrainingOperator
    :members:
.. autoclass:: airflow.contrib.operators.mlengine_operator.MLEngineVersionOperator
    :members:
.. autoclass:: airflow.contrib.hooks.gcp_mlengine_hook.MLEngineHook
    :members:
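A minimal sketch of submitting a training job, assuming a ``google_cloud_default`` connection; the project, bucket, trainer package and module names are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.mlengine_operator import MLEngineTrainingOperator

dag = DAG('mlengine_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Submit a Cloud ML Engine training job from a packaged trainer module.
training = MLEngineTrainingOperator(
    task_id='train_model',
    project_id='my-gcp-project',
    job_id='train_model_v1',
    package_uris=['gs://my-bucket/trainer/trainer-0.1.tar.gz'],
    training_python_module='trainer.task',
    training_args=['--train-files', 'gs://my-bucket/data/train.csv'],
    region='us-central1',
    scale_tier='BASIC',
    gcp_conn_id='google_cloud_default',
    dag=dag)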
- :ref:`FileToGoogleCloudStorageOperator` : Uploads a file to Google Cloud Storage.
- :ref:`GoogleCloudStorageCreateBucketOperator` : Creates a new cloud storage bucket.
- :ref:`GoogleCloudStorageListOperator` : List all objects from the bucket with the given string prefix and delimiter in name.
- :ref:`GoogleCloudStorageDownloadOperator` : Downloads a file from Google Cloud Storage.
- :ref:`GoogleCloudStorageToBigQueryOperator` : Loads files from Google Cloud Storage into BigQuery.
- :ref:`GoogleCloudStorageToGoogleCloudStorageOperator` : Copies objects from a bucket to another, with renaming if requested.
.. autoclass:: airflow.contrib.operators.file_to_gcs.FileToGoogleCloudStorageOperator
.. autoclass:: airflow.contrib.operators.gcs_operator.GoogleCloudStorageCreateBucketOperator
.. autoclass:: airflow.contrib.operators.gcs_download_operator.GoogleCloudStorageDownloadOperator
.. autoclass:: airflow.contrib.operators.gcs_list_operator.GoogleCloudStorageListOperator
.. autoclass:: airflow.contrib.operators.gcs_to_bq.GoogleCloudStorageToBigQueryOperator
.. autoclass:: airflow.contrib.operators.gcs_to_gcs.GoogleCloudStorageToGoogleCloudStorageOperator
.. autoclass:: airflow.contrib.hooks.gcs_hook.GoogleCloudStorageHook
    :members:
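A minimal sketch of uploading a local file to Cloud Storage and loading it into BigQuery, assuming ``google_cloud_default`` and ``bigquery_default`` connections; the bucket, object, dataset and schema are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_gcs import FileToGoogleCloudStorageOperator
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

dag = DAG('gcs_example', start_date=datetime(2018, 1, 1), schedule_interval='@daily')

# Upload a local CSV file to a bucket.
upload = FileToGoogleCloudStorageOperator(
    task_id='upload_csv',
    src='/tmp/events.csv',
    dst='events/events.csv',
    bucket='my-bucket',
    google_cloud_storage_conn_id='google_cloud_default',
    dag=dag)

# Load the uploaded file into a BigQuery table.
load = GoogleCloudStorageToBigQueryOperator(
    task_id='load_to_bq',
    bucket='my-bucket',
    source_objects=['events/events.csv'],
    destination_project_dataset_table='my-project.analytics.events',
    schema_fields=[
        {'name': 'user_id', 'type': 'STRING', 'mode': 'REQUIRED'},
        {'name': 'event', 'type': 'STRING', 'mode': 'NULLABLE'},
    ],
    source_format='CSV',
    skip_leading_rows=1,
    write_disposition='WRITE_APPEND',
    dag=dag)

upload >> load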
- :ref:`GKEClusterCreateOperator` : Creates a Kubernetes Cluster in Google Cloud Platform.
- :ref:`GKEClusterDeleteOperator` : Deletes a Kubernetes Cluster in Google Cloud Platform.
.. autoclass:: airflow.contrib.operators.gcp_container_operator.GKEClusterCreateOperator
.. autoclass:: airflow.contrib.operators.gcp_container_operator.GKEClusterDeleteOperator
.. autoclass:: airflow.contrib.operators.gcp_container_operator.GKEPodOperator
.. autoclass:: airflow.contrib.hooks.gcp_container_hook.GKEClusterHook
    :members:
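A minimal sketch of creating and later deleting a small GKE cluster, assuming a ``google_cloud_default`` connection; the project, location, cluster name and node count are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.gcp_container_operator import (
    GKEClusterCreateOperator,
    GKEClusterDeleteOperator,
)

dag = DAG('gke_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

# Create a one-node GKE cluster.
create_cluster = GKEClusterCreateOperator(
    task_id='create_cluster',
    project_id='my-gcp-project',
    location='us-central1-a',
    body={'name': 'example-cluster', 'initial_node_count': 1},
    dag=dag)

# Delete the cluster again when it is no longer needed.
delete_cluster = GKEClusterDeleteOperator(
    task_id='delete_cluster',
    project_id='my-gcp-project',
    location='us-central1-a',
    name='example-cluster',
    dag=dag)

create_cluster >> delete_cluster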
Apache Airflow has a native operator and hooks to talk to Qubole, which lets you submit your big data jobs directly to Qubole from Airflow.
.. autoclass:: airflow.contrib.operators.qubole_operator.QuboleOperator
.. autoclass:: airflow.contrib.sensors.qubole_sensor.QubolePartitionSensor
.. autoclass:: airflow.contrib.sensors.qubole_sensor.QuboleFileSensor
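A minimal sketch of running a Hive command on Qubole, assuming a ``qubole_default`` connection; the query and cluster label are illustrative only:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.qubole_operator import QuboleOperator

dag = DAG('qubole_example', start_date=datetime(2018, 1, 1), schedule_interval='@daily')

# Submit a Hive command to Qubole and wait for it to finish.
hive_query = QuboleOperator(
    task_id='hive_show_tables',
    command_type='hivecmd',
    query='show tables',
    cluster_label='default',
    qubole_conn_id='qubole_default',
    dag=dag)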