
Commit 13a30ac

XD-DENG authored and ashb committed
[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (apache#3924)
1 parent 8807071 commit 13a30ac

File tree

6 files changed: +11, -10 lines


docs/concepts.rst (+3, -3)

@@ -316,7 +316,7 @@ Connections
 ===========
 
 The connection information to external systems is stored in the Airflow
-metadata database and managed in the UI (``Menu -> Admin -> Connections``)
+metadata database and managed in the UI (``Menu -> Admin -> Connections``).
 A ``conn_id`` is defined there and hostname / login / password / schema
 information attached to it. Airflow pipelines can simply refer to the
 centrally managed ``conn_id`` without having to hard code any of this
@@ -338,15 +338,15 @@ See :doc:`howto/manage-connections` for how to create and manage connections.
 Queues
 ======
 
-When using the CeleryExecutor, the celery queues that tasks are sent to
+When using the CeleryExecutor, the Celery queues that tasks are sent to
 can be specified. ``queue`` is an attribute of BaseOperator, so any
 task can be assigned to any queue. The default queue for the environment
 is defined in the ``airflow.cfg``'s ``celery -> default_queue``. This defines
 the queue that tasks get assigned to when not specified, as well as which
 queue Airflow workers listen to when started.
 
 Workers can listen to one or multiple queues of tasks. When a worker is
-started (using the command ``airflow worker``), a set of comma delimited
+started (using the command ``airflow worker``), a set of comma-delimited
 queue names can be specified (e.g. ``airflow worker -q spark``). This worker
 will then only pick up tasks wired to the specified queue(s).
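As the corrected line notes, the queue names passed to ``airflow worker -q`` are comma-delimited. A quick illustration in plain POSIX shell of how such a list splits (``spark`` and ``quark`` are hypothetical queue names; this is generic shell, not Airflow's own parsing):

```shell
# Split a comma-delimited queue list such as the one given to `-q`.
QUEUES="spark,quark"
old_ifs=$IFS
IFS=','
set -- $QUEUES          # positional parameters become the individual queues
IFS=$old_ifs
for q in "$@"; do
    echo "listening on queue: $q"
done
```

Note there are no spaces after the commas; each resulting name maps to one Celery queue the worker subscribes to.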

docs/howto/executor/use-celery.rst (+1, -1)

@@ -44,4 +44,4 @@ Some caveats:
 
 - Make sure to use a database backed result backend
 - Make sure to set a visibility timeout in [celery_broker_transport_options] that exceeds the ETA of your longest running task
-- Tasks can and consume resources, make sure your worker as enough resources to run `worker_concurrency` tasks
+- Tasks can consume resources. Make sure your worker has enough resources to run `worker_concurrency` tasks
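The corrected caveat ties worker resources to `worker_concurrency`. A back-of-envelope sizing sketch (the 16-task and 256 MB figures are illustrative assumptions, not Airflow defaults):

```shell
# Rough memory budget: a worker running worker_concurrency tasks at once
# needs about worker_concurrency times the per-task footprint available.
WORKER_CONCURRENCY=16   # assumed value of celery's worker_concurrency
PER_TASK_MB=256         # assumed peak memory of one task, in MB
echo "approx memory needed: $(( WORKER_CONCURRENCY * PER_TASK_MB )) MB"
```

If the product exceeds what the worker host can provide, lower `worker_concurrency` or move heavy tasks to a dedicated queue.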

docs/howto/manage-connections.rst (+3, -3)

@@ -3,7 +3,7 @@ Managing Connections
 
 Airflow needs to know how to connect to your environment. Information
 such as hostname, port, login and passwords to other systems and services is
-handled in the ``Admin->Connection`` section of the UI. The pipeline code you
+handled in the ``Admin->Connections`` section of the UI. The pipeline code you
 will author will reference the 'conn_id' of the Connection objects.
 
 .. image:: ../img/connections.png
@@ -17,7 +17,7 @@ more information.
 Creating a Connection with the UI
 ---------------------------------
 
-Open the ``Admin->Connection`` section of the UI. Click the ``Create`` link
+Open the ``Admin->Connections`` section of the UI. Click the ``Create`` link
 to create a new connection.
 
 .. image:: ../img/connection_create.png
@@ -34,7 +34,7 @@ to create a new connection.
 Editing a Connection with the UI
 --------------------------------
 
-Open the ``Admin->Connection`` section of the UI. Click the pencil icon next
+Open the ``Admin->Connections`` section of the UI. Click the pencil icon next
 to the connection you wish to edit in the connection list.
 
 .. image:: ../img/connection_edit.png

docs/howto/secure-connections.rst (+1, -1)

@@ -26,7 +26,7 @@ variable over the value in ``airflow.cfg``:
 .. code-block:: bash
 
     # Note the double underscores
-    EXPORT AIRFLOW__CORE__FERNET_KEY = your_fernet_key
+    export AIRFLOW__CORE__FERNET_KEY=your_fernet_key
 
 4. Restart Airflow webserver.
 5. For existing connections (the ones that you had defined before installing ``airflow[crypto]`` and creating a Fernet key), you need to open each connection in the connection admin UI, re-type the password, and save it.
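This is the "wrong sample bash command" from the commit title: shell assignment forbids spaces around `=`, and `export` must be lowercase, otherwise the shell looks for a command literally named `EXPORT`. A minimal check of the corrected form (`your_fernet_key` is a placeholder, not a real Fernet key):

```shell
# Corrected form: lowercase `export`, no spaces around `=`.
export AIRFLOW__CORE__FERNET_KEY=your_fernet_key

# The variable is now visible to child processes, e.g. the Airflow webserver.
echo "$AIRFLOW__CORE__FERNET_KEY"
```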

docs/kubernetes.rst (+2, -1)

@@ -4,7 +4,8 @@ Kubernetes Executor
 The kubernetes executor is introduced in Apache Airflow 1.10.0. The Kubernetes executor will create a new pod for every task instance.
 
 Example helm charts are available at `scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml` in the source distribution. The volumes are optional and depend on your configuration. There are two volumes available:
-- Dags: by storing all the dags onto the persistent disks, all the workers can read the dags from there. Another option is using git-sync, before starting the container, a git pull of the dags repository will be performed and used throughout the lifecycle of the pod/
+
+- Dags: by storing all the dags onto the persistent disks, all the workers can read the dags from there. Another option is using git-sync, before starting the container, a git pull of the dags repository will be performed and used throughout the lifecycle of the pod.
 - Logs: by storing the logs onto a persistent disk, all the logs will be available for all the workers and the webserver itself. If you don't configure this, the logs will be lost after the worker pods shuts down. Another option is to use S3/GCS/etc to store the logs.

docs/ui.rst (+1, -1)

@@ -1,6 +1,6 @@
 UI / Screenshots
 =================
-The Airflow UI make it easy to monitor and troubleshoot your data pipelines.
+The Airflow UI makes it easy to monitor and troubleshoot your data pipelines.
 Here's a quick overview of some of the features and visualizations you
 can find in the Airflow UI.
 