[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos #3924

Merged · 1 commit · Sep 21, 2018
6 changes: 3 additions & 3 deletions docs/concepts.rst
@@ -320,7 +320,7 @@ Connections
===========

The connection information to external systems is stored in the Airflow
metadata database and managed in the UI (``Menu -> Admin -> Connections``)
metadata database and managed in the UI (``Menu -> Admin -> Connections``).
A ``conn_id`` is defined there and hostname / login / password / schema
information attached to it. Airflow pipelines can simply refer to the
centrally managed ``conn_id`` without having to hard code any of this
@@ -353,15 +353,15 @@ See :doc:`howto/manage-connections` for how to create and manage connections.
Queues
======

When using the CeleryExecutor, the celery queues that tasks are sent to
When using the CeleryExecutor, the Celery queues that tasks are sent to
can be specified. ``queue`` is an attribute of BaseOperator, so any
task can be assigned to any queue. The default queue for the environment
is defined in the ``airflow.cfg``'s ``celery -> default_queue``. This defines
the queue that tasks get assigned to when not specified, as well as which
queue Airflow workers listen to when started.

Workers can listen to one or multiple queues of tasks. When a worker is
started (using the command ``airflow worker``), a set of comma delimited
started (using the command ``airflow worker``), a set of comma-delimited
queue names can be specified (e.g. ``airflow worker -q spark``). This worker
will then only pick up tasks wired to the specified queue(s).
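
To illustrate the ``queue`` attribute discussed in this hunk, here is a minimal sketch (the DAG, task, and ``spark`` queue name are hypothetical examples, not part of this PR):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG('queue_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

    # Route this task to workers listening on the 'spark' queue, i.e. workers
    # started with ``airflow worker -q spark``. Tasks with no explicit queue
    # fall back to ``celery -> default_queue`` from airflow.cfg.
    submit = BashOperator(
        task_id='submit_spark_job',
        bash_command='spark-submit my_job.py',
        queue='spark',
        dag=dag,
    )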

2 changes: 1 addition & 1 deletion docs/howto/executor/use-celery.rst
@@ -44,4 +44,4 @@ Some caveats:

- Make sure to use a database backed result backend
- Make sure to set a visibility timeout in [celery_broker_transport_options] that exceeds the ETA of your longest running task
- Tasks can and consume resources, make sure your worker as enough resources to run `worker_concurrency` tasks
- Tasks can consume resources. Make sure your worker has enough resources to run `worker_concurrency` tasks
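
For the visibility-timeout caveat above, a rough sketch of the equivalent setting expressed directly in Celery (the Redis broker URL and the 21600-second value are placeholder assumptions; in Airflow itself this comes from the ``[celery_broker_transport_options]`` section of ``airflow.cfg``):

.. code-block:: python

    from celery import Celery

    app = Celery('example', broker='redis://localhost:6379/0')

    # The visibility timeout (in seconds) must exceed the ETA/runtime of the
    # longest task, or the broker re-delivers the task to another worker.
    app.conf.broker_transport_options = {'visibility_timeout': 21600}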
6 changes: 3 additions & 3 deletions docs/howto/manage-connections.rst
@@ -3,7 +3,7 @@ Managing Connections

Airflow needs to know how to connect to your environment. Information
such as hostname, port, login and passwords to other systems and services is
handled in the ``Admin->Connection`` section of the UI. The pipeline code you
handled in the ``Admin->Connections`` section of the UI. The pipeline code you
will author will reference the 'conn_id' of the Connection objects.

.. image:: ../img/connections.png
@@ -17,7 +17,7 @@ more information.
Creating a Connection with the UI
---------------------------------

Open the ``Admin->Connection`` section of the UI. Click the ``Create`` link
Open the ``Admin->Connections`` section of the UI. Click the ``Create`` link
to create a new connection.

.. image:: ../img/connection_create.png
@@ -34,7 +34,7 @@ to create a new connection.
Editing a Connection with the UI
--------------------------------

Open the ``Admin->Connection`` section of the UI. Click the pencil icon next
Open the ``Admin->Connections`` section of the UI. Click the pencil icon next
to the connection you wish to edit in the connection list.

.. image:: ../img/connection_edit.png
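
Alongside the UI flows shown in this file, a connection can also be created and looked up in code. A rough sketch against the 1.10-era APIs, with placeholder credentials:

.. code-block:: python

    from airflow import settings
    from airflow.hooks.base_hook import BaseHook
    from airflow.models import Connection

    # Create a connection programmatically (all values are placeholders).
    session = settings.Session()
    session.add(Connection(
        conn_id='my_postgres',
        conn_type='postgres',
        host='db.example.com',
        schema='analytics',
        login='airflow_user',
        password='s3cret',
        port=5432,
    ))
    session.commit()

    # Pipeline code then refers only to the conn_id.
    conn = BaseHook.get_connection('my_postgres')
    print(conn.host, conn.port)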
2 changes: 1 addition & 1 deletion docs/howto/secure-connections.rst
@@ -26,7 +26,7 @@ variable over the value in ``airflow.cfg``:
.. code-block:: bash
# Note the double underscores
EXPORT AIRFLOW__CORE__FERNET_KEY = your_fernet_key
export AIRFLOW__CORE__FERNET_KEY=your_fernet_key
4. Restart Airflow webserver.
5. For existing connections (the ones that you had defined before installing ``airflow[crypto]`` and creating a Fernet key), you need to open each connection in the connection admin UI, re-type the password, and save it.
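
The Fernet key used here can be generated with the ``cryptography`` package; a short sketch (the printed value is what goes into ``AIRFLOW__CORE__FERNET_KEY``):

.. code-block:: python

    from cryptography.fernet import Fernet

    # Produces a URL-safe base64-encoded 32-byte key, suitable for
    # AIRFLOW__CORE__FERNET_KEY / the core.fernet_key option in airflow.cfg.
    fernet_key = Fernet.generate_key()
    print(fernet_key.decode())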
3 changes: 2 additions & 1 deletion docs/kubernetes.rst
@@ -4,7 +4,8 @@ Kubernetes Executor
The kubernetes executor is introduced in Apache Airflow 1.10.0. The Kubernetes executor will create a new pod for every task instance.

Example helm charts are available at `scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml` in the source distribution. The volumes are optional and depend on your configuration. There are two volumes available:
- Dags: by storing all the dags onto the persistent disks, all the workers can read the dags from there. Another option is using git-sync, before starting the container, a git pull of the dags repository will be performed and used throughout the lifecycle of the pod/

- Dags: by storing all the dags onto the persistent disks, all the workers can read the dags from there. Another option is using git-sync, before starting the container, a git pull of the dags repository will be performed and used throughout the lifecycle of the pod.
- Logs: by storing the logs onto a persistent disk, all the logs will be available for all the workers and the webserver itself. If you don't configure this, the logs will be lost after the worker pods shuts down. Another option is to use S3/GCS/etc to store the logs.


2 changes: 1 addition & 1 deletion docs/ui.rst
@@ -1,6 +1,6 @@
UI / Screenshots
=================
The Airflow UI make it easy to monitor and troubleshoot your data pipelines.
The Airflow UI makes it easy to monitor and troubleshoot your data pipelines.
Here's a quick overview of some of the features and visualizations you
can find in the Airflow UI.
