Also use ODBC connection for sqlalchemy engine in OdbcHook like JdbcHook #5
Closed
Conversation
* Migrate the public endpoint Get DAG to FastAPI
* Use proper name for test function
* get dag_run init
* add serializer
* Merge branch 'main' of https://github.com/apache/airflow into kalyan/AIP-84/get_dag_run
* add types
* add test
* working tests
* add note to DagRunResponse
* add note
* add test to test non Null note
* Update airflow/api_fastapi/views/public/dag_run.py Co-authored-by: Pierre Jeambrun <pierrejbrun@gmail.com>
* Update airflow/api_fastapi/views/public/dag_run.py Co-authored-by: Pierre Jeambrun <pierrejbrun@gmail.com>
* Merge branch 'main' of https://github.com/apache/airflow into kalyan/AIP-84/get_dag_run
* add 404 test
---------
Co-authored-by: Pierre Jeambrun <pierrejbrun@gmail.com>
Co-authored-by: Majoros Donat (XC-DX/EET2-Bp) <donat.majoros2@hu.bosch.com>
This actually does a little bit more. It changes the backfill create endpoint to take a JSON payload instead of just query params. This is just easier because we can use the backfill schema as the schema of the request body. One thing that is maybe weird is that I add a decorator to translate the request body into the kwargs of the endpoint function. The main motivator here was compatibility with the requires_access_dag decorator, which doesn't check the request body. A rough sketch of the idea follows below.
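The body-to-kwargs translation might look roughly like this (a minimal sketch assuming a Flask-based endpoint; the decorator name and wiring here are illustrative, not the merged code):

```python
import functools

from flask import request


def body_to_kwargs(func):
    """Hypothetical decorator: unpack the JSON request body into kwargs.

    Access-control decorators that only inspect kwargs (such as
    requires_access_dag) keep working, because fields like dag_id from
    the body now show up as ordinary keyword arguments.
    """

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        payload = request.get_json(silent=True) or {}
        return func(*args, **{**payload, **kwargs})

    return wrapper
```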
This method uses Backfill internally. Before we can remove BackfillJobRunner, we need to remove DAG.run. But before we can remove DAG.run, we need to update some old tests that use it. So this is the first step towards removing BackfillJobRunner. There were some very old tests that came from airflow github issue 1225. These appeared to test the scheduler but really they tested the backfill job runner. Just to be cautious, I kept most of them rather than removing them (which probably would have been fine since they essentially tested code that we'll be removing). Where appropriate, I changed them to run via dag.test or the scheduler. The ones dealing with "ignore first depends_on_past" will have to be added back when that functionality is implemented in the new backfill.
-----
Co-authored-by: Kaxil Naik <kaxilnaik@gmail.com>
A couple of minor typos due to newlines
* Add possibility to override the conn type for Druid

  Minor fix, which allows using the scheme specified in the `schema` field rather than `http` as the default. At the same time it doesn't change the logic, as any conn_type can be selected. Intuitively, it's expected that anything specified in the `schema` field will take precedence when building the desired URL (see the sketch after this list).
* Add druid endpoint connection from another PR
* Fix missing scheme in test
* Set schema to None where it's unused

  Even though we don't need it directly set, by default the mock will set it to an internal object, thus we need to override it to None.
---------
Co-authored-by: Oleg Auckenthaler <github.sitcom838@passmail.net>
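Conceptually, the URL assembly changes along these lines (a simplified sketch; `conn` stands in for an Airflow Connection object, and this is not the literal hook code):

```python
def build_druid_url(conn, endpoint: str) -> str:
    # Prefer the connection's `schema` field; fall back to "http" only
    # when nothing was specified, instead of always forcing "http".
    scheme = conn.schema or "http"
    return f"{scheme}://{conn.host}:{conn.port}/{endpoint}"
```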
* Fix dag display name search
* Fix CI
…pache#42921)
* check _is_canary_run condition in is_legacy_ui_api_labeled method
* include pr check in is_legacy_ui_api_labeled
I'm not 100% sure why we added this check in the first place, but it doesn't seem to be needed anymore (I've tested things locally with this removed and it all seems to behave itself)
* Render errors when getting a list of dags
* Restore axios, prettierignore pnpm-store
* Add pnpm-store to prettier ignore
This makes sure we no longer receive any information about the Webserver URL sending usage info, such as the number of plugins and the like.
…roject (apache#42505) (apache#42624)

This is only a partial split so far. It moves all the code and tests, but leaves the creation of `core/` to a separate PR as this is already large enough. In addition to the straight file renames, the other changes I had to make here are:

- Some mypy/typing fixes. Mypy can be fragile about what it picks up when, so maybe some of those changes were caused by that. But the typing changes aren't large.
- Improve typing in the common.sql type stub. Again, likely a mypy file oddity, but the types should be safe.
- Removed the `check-providers-init-file-missing` check. This isn't needed now that airflow/providers shouldn't exist at all in the main tree.
- Create a "dev.tests_common" package that contains helper files and common pytest fixtures. Since the provider tests are no longer under tests/ they don't automatically share the fixtures from the parent `tests/conftest.py`, so these needed to be extracted. Ditto for `tests.test_utils` -- they can't easily be imported in provider tests anymore, so they are moved to a more explicit shared location.

In future we should switch how the CI image is built to make better use of uv caching rather than our own approach, as that would remove a lot of custom code.

Co-authored-by: Ash Berlin-Taylor <ash@apache.org>
Co-authored-by: Ryan Hatter <25823361+RNHTTR@users.noreply.github.com>
Follow up after apache#42766 and apache#42936.

* We do not have to check for the minimum Python version for Python 3.9 any more (as we no longer support 3.8).
* While Python 3.12 is not yet fully supported by Apache Beam, we should still not allow it for releasing providers, but all other commands should support Python 3.12.
* When you had pre-commit installed with Python 3.8 before, various errors might appear when running pre-commit. A troubleshooting section was added quoting the errors and explaining what to do.
grammar update for clarity 😄 👍
I create a new command group "backfill" for the management of backfills. The first action is "create", which creates a backfill. Others may follow, such as pause / cancel.
These links were pointing to the wrong location. This PR fixes them.
Now that astral-sh/uv#8236 (a bug in uv 0.4.22) is fixed, we can remove `--no-sources` (used in `uv pip install`), which was introduced in apache#43056.
* add min version to python-ldap
* add min version to python-ldap
In the areas where this is used, we don't want to include backfill runs in the counts. Rather than renaming the function to reflect the change, I add a parameter (a hypothetical sketch follows below). https://github.com/orgs/apache/projects/408
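For illustration only, the shape of the change might be something like this (the function, parameter, and record names are assumed, not taken from the actual diff):

```python
from dataclasses import dataclass


@dataclass
class Run:
    """Hypothetical stand-in for a DAG run record."""

    state: str
    run_type: str


def count_active_runs(runs, include_backfill: bool = False) -> int:
    """Same function name as before, new flag: call sites choose whether
    backfill runs count, instead of the function being renamed."""
    return sum(
        1
        for r in runs
        if r.state == "running" and (include_backfill or r.run_type != "backfill")
    )


# count_active_runs([Run("running", "scheduled"), Run("running", "backfill")]) == 1
```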
This behavior change was accepted by lazy consensus here: https://lists.apache.org/thread/9o84d3yn934m32gtlpokpwtbbmtxj47l. Previously max_active_tasks was evaluated across all runs of a dag. Co-authored-by: Wei Lee <weilee.rx@gmail.com>
* provide example for access URL conn string
* Apply suggestions from code review
---------
Co-authored-by: Kaxil Naik <kaxilnaik@gmail.com>
* Add Executor to Edge Provider
* Review feedback + small adjustments from Niko
* Add explicit note about non-performance in current state
* Fix adoption of tasks when restarting scheduler (cherry picked from commit 4e9c2262f567a2511d02d4acd43b821fa0df45d2)
* Optimize sync call for MVP executor
* Adjust new folder structure from PR 42505
* Review feedback: removed cleanup_stuck_queued_tasks() and added notes about performance tests
…3101)

So far, `pip` and `uv` version changes caused a clean reinstallation of the CI image's `pip` dependencies - including the cached layer from main - which pre-installed airflow to speed up reinstallation of the "final" airflow package. However, since we started to update `uv` more frequently, those frequent rebuilds are too costly (about 3 minutes extra when the uv version changed). This change implements an optimization of this workflow: the main cache installation is done using the LATEST uv or pip, and only after that are uv and pip reinstalled to the fixed version specified in the Dockerfile. Related: apache#42999
Allow SqlSensor to inspect the entire result row by adding a selector field. This is useful to customize the success/failure criteria based on the whole row instead of just the first cell (a sketch follows below). Co-authored-by: Jasmin <jasmin.a.patel13@gmail.com>
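The gist of the selector mechanism, as one might sketch it (assuming a "first cell" default; not necessarily the exact provider code):

```python
from operator import itemgetter


def poke(records, selector=itemgetter(0), success=None):
    """Sketch: `selector` sees the whole first row, so success/failure
    criteria can use any combination of columns, not just cell [0]."""
    if not records:
        return False
    value = selector(records[0])
    return bool(success(value)) if success is not None else bool(value)


# e.g. succeed only when two status columns agree:
# poke([("done", "done")], selector=lambda row: row[0] == row[1] == "done")
```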
* wip
* wip
* fix lint err
---------
Co-authored-by: venkat <venkat@venkats-MacBook-Pro.local>
…41731)
* Breeze adjustments for introduction of AIP-69 Remote Executor
* Rename Remote Executor to Edge Executor
Ignore "depends_on_past" for first run in a backfill This implements this pre-AIP-78 behavior in AIP-78 backfill logic. Depends on apache#42684
Follow-up of apache#42051. After merging to main, I built the docs for the edge executor and pushed the inventory file to S3, so the Airflow docs build can understand where the cross-ref is coming from, since it downloads the inventory file.
* add min version to plyvel
* update breeze tests
Same as with PR 42705, we need to make sure the SQLAlchemy engine uses the same ODBC connection when creating the engine in the OdbcHook, because the OdbcHook, just like the JdbcHook, is agnostic of which database it connects to. A minimal sketch of the idea follows.
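One way this could look, mirroring the JdbcHook change (a sketch that assumes the hook exposes a dialect URL via `sqlalchemy_url`; it is not the merged code):

```python
import sqlalchemy

from airflow.providers.odbc.hooks.odbc import OdbcHook


class OdbcHookSketch(OdbcHook):
    def get_sqlalchemy_engine(self, engine_kwargs=None):
        engine_kwargs = dict(engine_kwargs or {})
        # SQLAlchemy still needs a URL to pick the dialect, but `creator`
        # makes it call get_conn() instead of opening its own DBAPI
        # connection, so the engine reuses the hook's ODBC connection.
        return sqlalchemy.create_engine(
            self.sqlalchemy_url,  # assumed property returning the dialect URL
            creator=self.get_conn,
            **engine_kwargs,
        )
```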