defaulting to the default_timezone in the global config. Another problem is that the support for param validation assumes JSON.

(#22898), Fix pre-upgrade check for rows dangling w.r.t.

The method set_dag_runs_state is no longer needed after a bug fix in PR: #15382.

Previously not all hooks and operators related to Google Cloud use

If you have this issue, please report it on the mailing list.

custom-auth backend based on

Raise deep scheduler exceptions to force a process restart.

The REMOTE_BASE_LOG_FOLDER key is not used anymore. If you are using Python 2.7, ensure that any __init__.py files exist so that it is importable. If you set it to true (default) Airflow

In the PubSubPublishOperator and the PubSubHook.publish method, the data field in a message should be a bytestring (UTF-8 encoded) rather than a base64-encoded string.

other parameters are ignored.

(#25795), Allow per-timetable ordering override in grid view (#25633), Grid logs for mapped instances (#25610, #25621, #25611), Consolidate to one schedule param (#25410), DAG regex flag in backfill command (#23870)

value, what happens if you need to add more information, such as the API endpoint, or credentials?

the stable REST API, set the enable_experimental_api option in the [api] section to True.

(#13923), Fix invalid value error caused by long Kubernetes pod name (#13299), Fix DB Migration for SQLite to upgrade to 2.0 (#13921), Bugfix: Manual DagRun trigger should not skip scheduled runs (#13963), Stop loading Extra Operator links in Scheduler (#13932), Added missing return parameter in read function of FileTaskHandler (#14001), Bugfix: Do not try to create a duplicate Dag Run in Scheduler (#13920), Make v1/config endpoint respect webserver expose_config setting (#14020), Disable row level locking for Mariadb and MySQL <8 (#14031), Bugfix: Fix permissions to triggering only specific DAGs (#13922), Bugfix: Scheduler fails if task is removed at runtime (#14057), Remove permissions to read Configurations for User and Viewer roles (#14067), Increase the default min_file_process_interval to decrease CPU Usage (#13664), Dispose connections when running tasks with os.fork & CeleryExecutor (#13265), Make function purpose clearer in example_kubernetes_executor example dag (#13216), Remove unused libraries - flask-swagger, funcsigs (#13178), Display alternative tooltip when a Task has yet to run (no TI) (#13162), Use werkzeug's own type conversion for request args (#13184), UI: Add queued_by_job_id & external_executor_id Columns to TI View (#13266), Make json-merge-patch an optional library and unpin it (#13175), Adds missing LDAP extra dependencies to ldap provider.
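As a rough sketch of the PubSub change above, the message data moves from a base64-encoded string to a raw UTF-8 bytestring (the payload shown is illustrative, not from the original note):

```python
from base64 import b64encode

payload = "Hello World!"

# Before this change: data had to be base64-encoded.
old_message = {"data": b64encode(payload.encode("utf-8")).decode()}

# After this change: data is the UTF-8 bytestring itself.
new_message = {"data": payload.encode("utf-8")}
```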
Add "already checked" to failed pods in K8sPodOperator (#11368), Pass SQLAlchemy engine options to FAB based UI (#11395), [AIRFLOW-4438] Add Gzip compression to S3_hook (#8571), Add permission extra_links for Viewer role and above (#10719), Add generate_yaml command to easily test KubernetesExecutor before deploying pods (#10677), Add Secrets backend for Microsoft Azure Key Vault (#10898), SkipMixin: Handle empty branches (#11120), [AIRFLOW-5274] dag loading duration metric name too long (#5890), Handle no Dagrun in DagrunIdDep (#8389) (#11343), Fix Kubernetes Executor logs for long dag names (#10942), Add on_kill support for the KubernetesPodOperator (#10666), KubernetesPodOperator template fix (#10963), Fix displaying of add serialized_dag table migration, Fix Start Date tooltip on DAGs page (#10637), URL encode execution date in the Last Run link (#10595), Fixes issue with affinity backcompat in Airflow 1.10, Fix KubernetesExecutor import in views.py, Fix Entrypoint and _CMD config variables (#12411), Fix operator field update for SerializedBaseOperator (#10924), Limited cryptography to < 3.2 for Python 2.7, Install cattr on Python 3.7 - Fix docs build on RTD (#12045), Pin kubernetes to a max version of 11.0.0 (#11974), Use snakebite-py3 for HDFS dependency for Python3 (#12340), Removes snakebite kerberos dependency (#10865), Fix failing dependencies for FAB and Celery (#10828), Fix pod_mutation_hook for 1.10.13 (#10850), Fix Logout Google Auth issue in Non-RBAC UI (#11890), Show Generic Error for Charts & Query View in old UI (#12495), TimeSensor should respect the default_timezone config (#9699), TimeSensor should respect DAG timezone (#9882), Unify user session lifetime configuration (#11970), Handle outdated webserver session timeout gracefully.

This means administrators must opt in to expose tracebacks to end users.

Now that the DAG parser syncs DAG permissions, there is no longer a need for manually refreshing DAGs.

In Airflow < 2.0 you imported those two methods like this:

BranchPythonOperator will now return a value equal to the task_id of the chosen branch,

If you need to read logs, you can use the airflow.utils.log.log_reader.TaskLogReader class, which does not have

This behavior is problematic because to override these values in a dag run conf, you must use JSON, which could make these params non-overridable.
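A minimal sketch of the BranchPythonOperator behaviour described above, assuming Airflow 2.4+ import paths and parameter names (the dag and task ids are made up; older releases use DummyOperator and schedule_interval):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator


def choose_branch(**context):
    # The returned task_id selects the branch to follow; per the change
    # above it is also pushed to XCom when do_xcom_push=True (the default).
    return "fast_path"


with DAG(dag_id="branch_demo", start_date=datetime(2021, 1, 1), schedule=None):
    branch = BranchPythonOperator(task_id="branch", python_callable=choose_branch)
    branch >> [EmptyOperator(task_id="fast_path"), EmptyOperator(task_id="slow_path")]
```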
Fix module path of send_email_smtp in configuration, Fix SSHExecuteOperator crash when using a custom ssh port, Add note about Airflow components to template, Make SchedulerJob not run EVERY queued task, Improve BackfillJob handling of queued/deadlocked tasks, Introduce ignore_depends_on_past parameters, Rename user table to users to avoid conflict with postgres, Add support for calling_format from boto to S3_Hook, Add PyPI meta data and sync version number, Set dags_are_paused_at_creation's default value to True, Resurface S3Log class eaten by rebase/push -f, Add missing session.commit() at end of initdb, Validate that subdag tasks have pool slots available, and test, Use urlparse for remote GCS logs, and add unit tests, Make webserver worker timeout configurable, Use psycopg2's API for serializing postgres cell values, Make the provide_session decorator more robust, use num_shards instead of partitions to be consistent with batch ingestion, Update docs with separate configuration section, Fix airflow.utils deprecation warning code being Python 3 incompatible, Extract dbapi cell serialization into its own method, Set Postgres autocommit as supported only if server version is < 7.4, Use refactored utils module in unit test imports, remove unused logging, errno, MiniHiveCluster imports, Refactoring utils into smaller submodules, Properly measure number of task retry attempts, Add function to get configuration as dict, plus unit tests, Merge branch 'master' into hivemeta_sasl, [hotfix] make email.Utils > email.utils for py3, Add the missing Date header to the warning e-mails, Check name of SubDag class instead of class itself, [hotfix] removing repo_token from .coveralls.yml, Add unit tests for trapping Executor errors, Fix HttpOpSensorTest to use fake request session, Add an example on pool usage in the documentation.

extras at all.

Instead, it now accepts: table - will render the output in a predefined table.

are not supported.

For example, if the airflowignore file contained x, and the dags folder was /var/x/dags, then all dags in

Now, invalid arguments will be rejected.

Change python3 as Dataflow Hooks/Operators default interpreter.

In previous versions of Airflow it was possible to use plugins to load custom executors.

behaviour is still achievable by setting the param success to lambda x: x is None or str(x) not in ('0', '').
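The success-callable workaround above presumably targets a SQL-style sensor; a sketch, assuming the Airflow 2.x SqlSensor import path and a hypothetical connection id and query:

```python
from airflow.sensors.sql import SqlSensor  # import path varies by Airflow version

wait_for_rows = SqlSensor(
    task_id="wait_for_rows",
    conn_id="my_db",  # hypothetical connection
    sql="SELECT COUNT(*) FROM my_table",
    # Reproduce the pre-change truthiness check described above.
    success=lambda x: x is None or str(x) not in ("0", ""),
)
```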
(#6627), [AIRFLOW-4145] Allow RBAC roles permissions, ViewMenu to be over-rideable (#4960), [AIRFLOW-5928] Hive hooks load_file short circuit (#6582), [AIRFLOW-5313] Add params support for awsbatch_operator (#5900), [AIRFLOW-2227] Add delete method to Variable class (#4963), [AIRFLOW-5082] Add subject in AwsSnsHook (#5694), [AIRFLOW-5715] Make email, owner context available (#6385), [AIRFLOW-5345] Allow SqlSensor's hook to be customized by subclasses (#5946), [AIRFLOW-5417] Fix DB disconnects during webserver startup (#6023), [AIRFLOW-5730] Enable get_pandas_df on PinotDbApiHook (#6399), [AIRFLOW-3235] Add list function in AzureDataLakeHook (#4070), [AIRFLOW-5442] implementing get_pandas_df method for druid broker hook (#6057), [AIRFLOW-5883] Improve count() queries in a few places (#6532), [AIRFLOW-5811] Add metric for externally killed task count (#6466), [AIRFLOW-5758] Support the custom cursor classes for the PostgreSQL hook (#6432), [AIRFLOW-5766] Use httpbin.org in http_default (#6438), [AIRFLOW-5798] Set default ExternalTaskSensor.external_task_id (#6431), [AIRFLOW-5643] Reduce duplicated logic in S3Hook (#6313), [AIRFLOW-5562] Skip grant single DAG permissions for Admin role.

To find processing errors, go to the child_process_log_directory, which defaults to <AIRFLOW_HOME>/scheduler/latest.

If the user provides run_type and execution_date then run_id is constructed as

null in the conn_type column.

configure a backend secret; it also means the webserver doesn't need to connect to it.

Bugfix: TypeError when Serializing & sorting iterable properties of DAGs (#15395), Fix missing on_load trigger for folder-based plugins (#15208), kubernetes cleanup-pods subcommand will only clean up Airflow-created Pods (#15204), Fix password masking in CLI action_logging (#15143), Fix url generation for TriggerDagRunOperatorLink (#14990), Unable to trigger backfill or manual jobs with Kubernetes executor.

This section describes the changes that have been made, and what you need to do to update your Python files.

because in the Airflow codebase we should not allow hooks to misuse the Connection.extra field in this way.

The previous default setting was to allow all API requests without authentication, but this poses security risks.

Parameters to the hook can only be passed via keyword arguments.

From this version on, the operator will only skip direct downstream tasks, and the scheduler will handle skipping any further downstream dependencies.

For example: /admin/connection becomes /connection/list, /admin/connection/new becomes /connection/add, /admin/connection/edit becomes /connection/edit, etc.

Currently, there are other log format and level configurations in

Check if hook is instance of DbApiHook.
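To illustrate the keyword-only calling convention above with a self-contained sketch (MyServiceHook is hypothetical, not a real Airflow class):

```python
class MyServiceHook:
    """Hypothetical hook: the bare * makes every parameter keyword-only."""

    def __init__(self, *, conn_id: str, timeout: int = 30):
        self.conn_id = conn_id
        self.timeout = timeout


# MyServiceHook("my_conn", 10)   # TypeError: positional arguments are rejected
hook = MyServiceHook(conn_id="my_conn", timeout=10)
```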
From Airflow 3.0, the extra field in airflow connections must be a JSON-encoded Python dict.

Previously the command line option num_runs was used to let the scheduler terminate after a certain amount of

(#4279), [AIRFLOW-3411] Add OpenFaaS hook (#4267), [AIRFLOW-2785] Add context manager entry points to mongoHook, [AIRFLOW-2524] Add SageMaker doc to AWS integration section (#4278), [AIRFLOW-3479] Keeps records in Log Table when DAG is deleted (#4287), [AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (#3793), [AIRFLOW-2245] Add remote_host of SSH/SFTP operator as templated field (#3765), [AIRFLOW-2670] Update SSH Operator's Hook to respect timeout (#3666), [AIRFLOW-3380] Add metrics documentation (#4219), [AIRFLOW-3361] Log the task_id in the PendingDeprecationWarning from BaseOperator (#4030), [AIRFLOW-3213] Create ADLS to GCS operator (#4134), [AIRFLOW-3395] added the REST API endpoints to the doc (#4236), [AIRFLOW-3294] Update connections form and integration docs (#4129), [AIRFLOW-3236] Create AzureDataLakeStorageListOperator (#4094), [AIRFLOW-3306] Disable flask-sqlalchemy modification tracking.

The fernet mechanism is enabled by default to increase the security of the default installation. Airflow <=2.0.1.

To clean up, the following packages were moved:
airflow.utils.log.gcs_task_handler → airflow.providers.google.cloud.log.gcs_task_handler,
airflow.utils.log.wasb_task_handler → airflow.providers.microsoft.azure.log.wasb_task_handler,
airflow.utils.log.stackdriver_task_handler → airflow.providers.google.cloud.log.stackdriver_task_handler,
airflow.utils.log.s3_task_handler → airflow.providers.amazon.aws.log.s3_task_handler,
airflow.utils.log.es_task_handler → airflow.providers.elasticsearch.log.es_task_handler,
airflow.utils.log.cloudwatch_task_handler → airflow.providers.amazon.aws.log.cloudwatch_task_handler.

Note that dag_run.run_type is a more authoritative value for this purpose.

to be updated as follows: AwsBatchOperator().jobId -> AwsBatchOperator().job_id, AwsBatchOperator().jobName -> AwsBatchOperator().job_name. Tests have been adjusted.

The Airflow dag home page is now /home (instead of /admin).

and some of them may be breaking.

FAB's built-in authentication support must be reconfigured.

(picking up from jthomas123), Make sure paths don't conflict bc of trailing /, Refactor remote log read/write and add GCS support, Only use multipart upload in S3Hook if file is large enough.

(#16718), Fix calculating duration in tree view (#16695), Fix AttributeError: 'datetime.timezone' object has no attribute 'name' (#16599), Redact conn secrets in webserver logs (#16579), Change graph focus to top of view instead of center (#16484), Fail tasks in scheduler when executor reports they failed (#15929), fix(smart_sensor): Unbound variable errors (#14774), Add back missing permissions to UserModelView controls.

Add a way to import Airflow without side-effects (#25832), Let timetables control generated run_ids.

Hooks can define custom connection fields for their connection type by implementing the method get_connection_form_widgets.

convenience variables to the config.

(Since this setting is used to calculate what config file to load, it is not

Ec2SubnetId, TerminationProtection and KeepJobFlowAliveWhenNoSteps were all top-level keys when they

https://cloud.google.com/compute/docs/disks/performance

Copy the contents to ${AIRFLOW_HOME}/config/airflow_local_settings.py, and alter the config as is preferred.
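A brief sketch of the JSON-encoded extra rule from the start of this note (the connection id, type, and fields are made up):

```python
import json

from airflow.models import Connection

conn = Connection(
    conn_id="my_service",
    conn_type="http",
    # extra must be a JSON-encoded dict, not an arbitrary string.
    extra=json.dumps({"endpoint": "https://api.example.com", "verify_ssl": True}),
)
print(conn.extra_dejson["endpoint"])  # extra_dejson parses it back into a dict
```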
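And a sketch of the get_connection_form_widgets mechanism mentioned above, modeled on the provider pattern (the hook, conn_type, and field name are hypothetical; as noted further below, older releases required keys of the form extra__&lt;conn type&gt;__&lt;field name&gt;):

```python
from airflow.hooks.base import BaseHook
from flask_appbuilder.fieldwidgets import BS3TextFieldWidget
from flask_babel import lazy_gettext
from wtforms import StringField


class MyServiceHook(BaseHook):
    """Hypothetical hook exposing one extra field on the connection form."""

    conn_type = "my_service"
    hook_name = "My Service"

    @classmethod
    def get_connection_form_widgets(cls) -> dict:
        # Each entry becomes an extra input on the connection form; values
        # are persisted in the connection's extra dict.
        return {
            "endpoint": StringField(lazy_gettext("Endpoint"), widget=BS3TextFieldWidget()),
        }
```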
The scheduler.min_file_parsing_loop_time config option has been temporarily removed due to

[AIRFLOW-1765] Make experimental API securable without needing Kerberos.

If you wish to have the experimental API work, and are aware of the risks of enabling this without authentication

The functions of the standard library are more flexible and can be used in a wider range of cases.

This is configurable at the DAG level with max_active_tasks, and a default can be set in airflow.cfg as

If any other package imports

Installing both the Snowflake and Azure extras will result in non-importable

Previously, a task instance with wait_for_downstream=True would only run if the downstream task of

The current number of CPU cores and threads

PR: https://github.com/apache/airflow/pull/6317

The code that was in the contrib

That user can only access/view certain dags on the UI.

There is a report that the default of -1 for num_runs creates an issue where errors are reported while parsing tasks.

a GPL dependency.

xcom_push of this value if do_xcom_push=True.

to historical reasons.

For example: if you now resolve a Param without a default and don't pass a value, you will get a TypeError.

data-aware scheduling.

This directory is loaded by default.

SSL support still works for the WebHDFS hook.

Users can preserve the original behaviour by setting the trigger_rule of each downstream task to all_success.

If you are using S3, the instructions should be largely the same as the Google Cloud Platform instructions above.

For technical reasons, previously, when stored in the extra dict, the custom field's dict key had to take the form extra__&lt;conn type&gt;__&lt;field name&gt;.

If you want to use the LDAP auth backend without TLS then you will have to create a

[AIRFLOW-3297] EmrStepSensor marks cancelled step as successful.

Now the py_interpreter argument for Dataflow Hooks/Operators has been changed from python2 to python3.
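A small sketch of the Param behaviour above, assuming the Airflow 2.2-era airflow.models.param.Param API (later releases raise a dedicated validation error instead of a bare TypeError):

```python
from airflow.models.param import Param

name = Param(type="string")  # schema only, no default supplied

print(name.resolve("hello"))  # validates against the schema and returns "hello"

try:
    name.resolve()  # no value and no default -> raises, as described above
except Exception as err:
    print(type(err).__name__)
```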
(#12332), Add XCom.deserialize_value to Airflow 1.10.13 (#12328), Mount airflow.cfg to pod_template_file (#12311), All k8s object must comply with JSON Schema (#12003), Validate Airflow chart values.yaml & values.schema.json (#11990), Pod template file uses custom env variable (#11480), Bump attrs and cattrs dependencies (#11969), [AIRFLOW-3607] Only query DB once per DAG run for TriggerRuleDep (#4751), Manage Flask AppBuilder Tables using Alembic Migrations (#12352), airflow test only works for tasks in 1.10, not whole dags (#11191), Improve warning messaging for duplicate task_ids in a DAG (#11126), DbApiHook: Support kwargs in get_pandas_df (#9730), Make grace_period_seconds option on K8sPodOperator (#10727), Fix syntax error in Dockerfile maintainer Label (#10899), The entrypoints in Docker Image should be owned by Airflow (#10853), Make Dockerfiles Google Shell Guide Compliant (#10734), clean-logs script for Dockerfile: trim logs before sleep (#10685), When sending tasks to celery from a sub-process, reset signal handlers (#11278), SkipMixin: Add missing session.commit() and test (#10421), Webserver: Further Sanitize values passed to origin param (#12459), Security upgrade lodash from 4.17.19 to 4.17.20 (#11095), Log instead of raise an Error for unregistered OperatorLinks (#11959), Mask Password in Log table when using the CLI (#11468), [AIRFLOW-3607] Optimize dep checking when depends on past set and concurrency limit, Execute job cancel HTTPRequest in Dataproc Hook (#10361), Use rst lexer to format Airflow upgrade check output (#11259), Remove deprecation warning from contrib/kubernetes/pod.py, adding body as templated field for CloudSqlImportOperator (#10510), Change log level for User's session to DEBUG (#12414), Deprecate importing Hooks from plugin-created module (#12133), Deprecate adding Operators and Sensors via plugins (#12069), [Doc] Correct description for macro task_instance_key_str (#11062), Checks if all the libraries in setup.py are listed in installation.rst file (#12023), Move Project focus and Principles higher in the README (#11973), Remove archived link from README.md (#11945), Update download url for Airflow Version (#11800), Move Backport Providers docs to our docsite (#11136), Add missing images for kubernetes executor docs (#11083), Fix indentation in executor_config example (#10467), Enhanced the Kubernetes Executor doc (#10433), Refactor content to a markdown table (#10863), Rename Beyond the Horizon section and refactor content (#10802), Refactor official source section to use bullets (#10801), Add section for official source code (#10678), Add redbubble link to Airflow merchandise (#10359), README Doc: Link to Airflow directory in ASF Directory (#11137), Fix the default value for VaultBackend's config_path (#12518).

was the plugin name, whereas in the second example it is the python module name where the operator is defined.
(#6678), [AIRFLOW-5117] Automatically refresh EKS API tokens when needed (#5731), [AIRFLOW-5118] Add ability to specify optional components in DataprocClusterCreateOperator (#5821), [AIRFLOW-5681] Allow specification of a tag or hash for the git_sync init container (#6350), [AIRFLOW-6025] Add label to uniquely identify creator of Pod (#6621), [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator) (#5489), [AIRFLOW-5751] add get_uri method to Connection (#6426), [AIRFLOW-6056] Allow EmrAddStepsOperator to accept job_flow_name as alternative to job_flow_id (#6655), [AIRFLOW-2694] Declare permissions in DAG definition (#4642), [AIRFLOW-4940] Add DynamoDB to S3 operator (#5663), [AIRFLOW-4161] BigQuery to MySQL Operator (#5711), [AIRFLOW-6041] Add user agent to the Discovery API client (#6636), [AIRFLOW-6089] Reorder setup.py dependencies and add ci (#6681), [AIRFLOW-5921] Add bulk_load_custom to MySqlHook (#6575), [AIRFLOW-5854] Add support for tty parameter in Docker related operators (#6542), [AIRFLOW-4758] Add GcsToGDriveOperator operator (#5822), [AIRFLOW-3656] Show doc link for the current installed version (#6690), [AIRFLOW-5665] Add path_exists method to SFTPHook (#6344), [AIRFLOW-5729] Make InputDataConfig optional in SageMaker's training config (#6398), [AIRFLOW-5045] Add ability to create Google Dataproc cluster with custom image from a different project (#5752), [AIRFLOW-6132] Allow to pass in tags for the AzureContainerInstancesOperator (#6694), [AIRFLOW-5945] Make inbuilt OperatorLinks work when using Serialization (#6715), [AIRFLOW-5947] Make the JSON backend pluggable for DAG Serialization (#6630), [AIRFLOW-6239] Filter dags returned by last_dagruns (to only select visible dags, not all dags) (#6804), [AIRFLOW-6095] Filter dags returned by task_stats (to only select visible dags, not all dags) (#6684), [AIRFLOW-4482] Add execution_date to trigger DagRun API response (#5260), [AIRFLOW-1076] Add get method for template variable accessor (#6793), [AIRFLOW-5194] Add error handler to action log (#5883), [AIRFLOW-5936] Allow explicit get_pty in SSHOperator (#6586), [AIRFLOW-5474] Add Basic auth to Druid hook (#6095), [AIRFLOW-5726] Allow custom filename in RedshiftToS3Transfer (#6396), [AIRFLOW-5834] Option to skip serve_logs process with airflow worker (#6709), [AIRFLOW-5583] Extend the DAG Details page to display the start_date / end_date (#6235), [AIRFLOW-6250] Ensure on_failure_callback always has a populated context (#6812), [AIRFLOW-6222] http hook logs response body for any failure (#6779), [AIRFLOW-6260] Drive _cmd config option by env var (AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_CMD for example) (#6801), [AIRFLOW-6168] Allow proxy_fix middleware of webserver to be configurable (#6723), [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
EMRHook.create_job_flow has been changed to pass all keys to the create_job_flow API, rather than

When a ReadyToRescheduleDep is run, it now checks the reschedule attribute on the operator, and always reports itself as passed unless it is set to True.

the instance configuration by changing the value controlled by the user - connection entry.

Note that JSON serialization is stricter than pickling, so for example if you want to pass

For production docker image related changes, see the Docker Image Changelog.

This is controlled by

We strive to ensure that there are no changes that may affect the end user and your files, but this release may contain changes that will require changes to your plugins, DAG files, or other integrations.

which apply to most services.

The bucket_name is now optional.

Previous versions of Airflow took additional arguments and displayed a message on the console.

Now users, instead of from airflow.utils.files import TemporaryDirectory, should

will discover its config file using the $AIRFLOW_CONFIG and $AIRFLOW_HOME

The REMOTE_BASE_LOG_FOLDER configuration key in your airflow config has been removed, therefore you will need to take the following steps: copy the logging configuration from airflow/config_templates/airflow_logging_settings.py.

The all extras were reduced to include only user-facing dependencies.

For more details about Python logging, please refer to the official logging documentation.

Formerly the core code was maintained by the original creators - Airbnb.

Please see AIRFLOW-1455.

The new pool config option allows users to choose different pool

Discovery API to native google-cloud-build python library.

This means that users now have access to the full Kubernetes API.

However, when reading the new option, the old option will be checked to see if it exists.

You should still pay attention to the changes that

If you are using the DAGs Details API endpoint, use max_active_tasks instead of concurrency.

Since 1.10.12, when such skipped tasks are cleared, you will see a deprecation warning.

behavior can be overridden by sending replace_microseconds=true along with an explicit execution_date.

The WasbHook in Apache Airflow uses a legacy version of the Azure library.

The Kubernetes version is described in Installation prerequisites.

for both libraries overlap.
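For the TemporaryDirectory migration above, the standard-library replacement is a direct swap (the prefix shown is illustrative):

```python
from tempfile import TemporaryDirectory

# Drop-in standard-library replacement for the removed Airflow helper.
with TemporaryDirectory(prefix="airflowtmp") as tmp_dir:
    print(f"scratch space at {tmp_dir}")
# The directory and everything inside it is deleted on exiting the block.
```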
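And a hedged sketch of the replace_microseconds override above, assuming the 1.10-era experimental REST endpoint for triggering DAG runs (the host and dag id are made up; requests is used only for illustration):

```python
import requests

resp = requests.post(
    "http://localhost:8080/api/experimental/dags/my_dag/dag_runs",
    json={
        "execution_date": "2021-01-01T00:00:00.123456",
        # Override the default behavior and zero out the microseconds.
        "replace_microseconds": "true",
    },
)
print(resp.status_code)
```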
TaskRunner: notify of component start and finish (#27855), Add DagRun state change to the Listener plugin system(#27113), Metric for raw task return codes (#27155), Add logic for XComArg to pull specific map indexes (#27771), Add critical section query duration metric (#27700), Add: #23880 :: Audit log for AirflowModelViews(Variables/Connection) (#24079, #27994, #27923), Expand tasks in mapped group at run time (#27491), scheduler_job, add metric for scheduler loop timer (#27605), Allow datasets to be used in taskflow (#27540), Add expanded_ti_count to ti context (#27680), Add user comment to task instance and dag run (#26457, #27849, #27867), Enable copying DagRun JSON to clipboard (#27639), Implement extra controls for SLAs (#27557), Add max_wait for exponential_backoff in BaseSensor (#27597), Expand tasks in mapped group at parse time (#27158), Add disable retry flag on backfill (#23829), Filtering datasets by recent update events (#26942), Support Is /not Null filter for value is None on webui (#26584), Split out and handle params in mapped operator (#26100), Add authoring API for TaskGroup mapping (#26844), Create a more efficient airflow dag test command that also has better local logging (#26400), Support add/remove permissions to roles commands (#26338), Add triggerer info to task instance in API (#26249), Flag to deserialize value on custom XCom backend (#26343), UI: Update offset height if data changes (#27865), Improve TriggerRuleDep typing and readability (#27810), Make views requiring session, keyword only args (#27790), Optimize TI.xcom_pull() with explicit task_ids and map_indexes (#27699), Allow hyphens in pod id used by k8s executor (#27737), optimise task instances filtering (#27102), Use context managers to simplify log serve management (#27756), Improve sensor timeout messaging (#27733), Align TaskGroup semantics to AbstractOperator (#27723), Add new files to parsing queue on every loop of dag processing (#27060), Make Kubernetes Executor & Scheduler resilient to error during PMH execution (#27611), Separate dataset deps into individual graphs (#27356), Use log.exception where more economical than log.error (#27517), Move validation branch_task_ids into SkipMixin (#27434), Coerce LazyXComAccess to list when pushed to XCom (#27251), Update cluster-policies.rst docs (#27362), Add warning if connection type already registered within the provider (#27520), Activate debug logging in commands with verbose option (#27447), Add classic examples for Python Operators (#27403), Improve reset_dag_run description (#26755), Add examples and howtos about sensors (#27333), Make grid view widths adjustable (#27273), Sorting plugins custom menu links by category before name (#27152), Simplify DagRun.verify_integrity (#26894), Add mapped task group info to serialization (#27027), Correct the JSON style used for Run config in Grid View (#27119), No extra__conn_type__ prefix required for UI behaviors (#26995), Rename kubernetes config section to kubernetes_executor (#26873), decode params for dataset searches (#26941), Get rid of the DAGRun details page & rely completely on Grid (#26837), Fix scheduler crashloopbackoff when using hostname_callable (#24999), Reduce log verbosity in KubernetesExecutor. It has been removed. Error message: For SELECT DISTINCT, [core] max_active_tasks_per_dag. There is currently one parameter The 'text' appears to have fractional seconds and a timezone offset. of the operators had PROJECT_ID mandatory. 
(#15210), Make task ID on legend have enough width and width of line chart to be 100%.

Similarly, if you were using the DagBag().store_serialized_dags property, change it to

(#21074), Better multiple_outputs inferral for @task.python (#20800), Improve handling of string type and non-attribute template_fields (#21054), Remove un-needed deps/version requirements (#20979), Correctly specify overloads for TaskFlow API for type-hinting (#20933), Introduce notification_sent to SlaMiss view (#20923), Rewrite the task decorator as a composition (#20868), Add Greater/Smaller than or Equal to filters in the browse views (#20602) (#20798), Rewrite DAG run retrieval in task command (#20737), Speed up creation of DagRun for large DAGs (5k+ tasks) by 25-130% (#20722), Make native environment Airflow-flavored like sandbox (#20704), Better error when param value has unexpected type (#20648), Add filter by state in DagRun REST API (List Dag Runs) (#20485), Prevent exponential memory growth in Tasks with custom logging handler (#20541), Set default logger in logging Mixin (#20355), Reduce deprecation warnings from www (#20378), Add hour and minute to time format on x-axis of all charts using nvd3.lineChart (#20002), Add specific warning when Task asks for more slots than pool defined with (#20178), UI: Update duration column for better human readability (#20112), Use Viewer role as example public role (#19215), Properly implement DAG param dict copying (#20216), ShortCircuitOperator push XCom by returning python_callable result (#20071), Add clear logging to tasks killed due to a Dagrun timeout (#19950), Change log level for Zombie detection messages (#20204), Only execute TIs of running DagRuns (#20182), Check and run migration in commands if necessary (#18439), Increase length of the email and username (#19932), Add more filtering options for TIs in the UI (#19910), Dynamically enable Test Connection button by connection type (#19792), Avoid littering postgres server logs with "could not obtain lock" with HA schedulers (#19842), Renamed Connection.get_hook parameter to make it the same as in SqlSensor and SqlOperator. (#23528), Only count bad refs when moved table exists (#23491), Visually distinguish task group summary (#23488), Remove color change for highly nested groups (#23482), Optimize 2.3.0 pre-upgrade check queries (#23458), Add backward compatibility for core__sql_alchemy_conn__cmd (#23441), Fix literal cross product expansion (#23434), Fix broken task instance link in xcom list (#23367), fix cli airflow dags show for mapped operator (#23339), Hide some task instance attributes (#23338), Don't show grid actions if server would reject with permission denied (#23332), Use run_id for ti.mark_success_url (#23330), Use