This patch allows for testing of Hive operators and hooks; I have updated the description with how I tried it, in the Dockerfile. [AIRFLOW-1384] Add ARGO/CaDC as a Airflow user, [AIRFLOW-1357] Fix scheduler zip file support, [AIRFLOW-1382] Add working dir option to DockerOperator, [AIRFLOW-1388] Add Cloud ML Engine operators to integration doc, [AIRFLOW-1366] Add max_tries to task instance, [AIRFLOW-1300] Enable table creation with TBLPROPERTIES, [AIRFLOW-1271] Add Google CloudML Training Operator, [AIRFLOW-300] Add Google Pubsub hook and operator, [AIRFLOW-1367] Pass Content-ID to reference inline images in an email (we need to be able to add them to the HTML), [AIRFLOW-378] Add string casting to params of spark-sql operator, [AIRFLOW-544] Add Pause/Resume toggle button, [AIRFLOW-333][AIRFLOW-258] Fix non-module plugin components, [AIRFLOW-542] Add tooltip to DAGs links icons, [AIRFLOW-530] Update docs to reflect connection environment var has to be in uppercase, [AIRFLOW-525] Update template_fields in Qubole Op, [AIRFLOW-480] Support binary file download from GCS, [AIRFLOW-198] Implement latest_only_operator, [AIRFLOW-91] Add SSL config option for the webserver, [AIRFLOW-191] Fix connection leak with PostgreSQL backend, [AIRFLOW-512] Fix bellow typo in docs & comments, [AIRFLOW-509][AIRFLOW-1] Create operator to delete tables in BigQuery, [AIRFLOW-498] Remove hard-coded gcp project id, [AIRFLOW-505] Support unicode characters in authors names, [AIRFLOW-494] Add per-operator success/failure metrics, [AIRFLOW-468] Update panda requirement to 0.17.1, [AIRFLOW-159] Add cloud integration section + GCP documentation, [AIRFLOW-477][AIRFLOW-478] Restructure security section for clarity, [AIRFLOW-467] Allow defining of project_id in BigQueryHook, [AIRFLOW-483] Change print to logging statement, [AIRFLOW-475] Make the segment granularity in Druid hook configurable. wait_for_transfer_job now waits for any of the expected statuses.
Dataflow job labeling is now supported in Dataflow{Java,Python}Operator with a default "airflow-version" label. In previous versions, you could use the plugins mechanism to configure stat_name_handler; it should now be set via the stat_name_handler option in the [scheduler] section of airflow.cfg. This acted as a basic load balancing and fault tolerance technique, when used in conjunction with retries. (#6678), [AIRFLOW-5117] Automatically refresh EKS API tokens when needed (#5731), [AIRFLOW-5118] Add ability to specify optional components in DataprocClusterCreateOperator (#5821), [AIRFLOW-5681] Allow specification of a tag or hash for the git_sync init container (#6350), [AIRFLOW-6025] Add label to uniquely identify creator of Pod (#6621), [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator) (#5489), [AIRFLOW-5751] add get_uri method to Connection (#6426), [AIRFLOW-6056] Allow EmrAddStepsOperator to accept job_flow_name as alternative to job_flow_id (#6655), [AIRFLOW-2694] Declare permissions in DAG definition (#4642), [AIRFLOW-4940] Add DynamoDB to S3 operator (#5663), [AIRFLOW-4161] BigQuery to MySQL Operator (#5711), [AIRFLOW-6041] Add user agent to the Discovery API client (#6636), [AIRFLOW-6089] Reorder setup.py dependencies and add ci (#6681), [AIRFLOW-5921] Add bulk_load_custom to MySqlHook (#6575), [AIRFLOW-5854] Add support for tty parameter in Docker related operators (#6542), [AIRFLOW-4758] Add GcsToGDriveOperator operator (#5822), [AIRFLOW-3656] Show doc link for the current installed version (#6690), [AIRFLOW-5665] Add path_exists method to SFTPHook (#6344), [AIRFLOW-5729] Make InputDataConfig optional in Sagemakers training config (#6398), [AIRFLOW-5045] Add ability to create Google Dataproc cluster with custom image from a different project (#5752), [AIRFLOW-6132] Allow to pass in tags for the AzureContainerInstancesOperator (#6694), [AIRFLOW-5945] Make inbuilt OperatorLinks work when using Serialization (#6715), [AIRFLOW-5947] Make 
the JSON backend pluggable for DAG Serialization (#6630), [AIRFLOW-6239] Filter dags return by last_dagruns (to only select visible dags, not all dags) (#6804), [AIRFLOW-6095] Filter dags returned by task_stats (to only select visible dags, not all dags) (#6684), [AIRFLOW-4482] Add execution_date to trigger DagRun API response (#5260), [AIRFLOW-1076] Add get method for template variable accessor (#6793), [AIRFLOW-5194] Add error handler to action log (#5883), [AIRFLOW-5936] Allow explicit get_pty in SSHOperator (#6586), [AIRFLOW-5474] Add Basic auth to Druid hook (#6095), [AIRFLOW-5726] Allow custom filename in RedshiftToS3Transfer (#6396), [AIRFLOW-5834] Option to skip serve_logs process with airflow worker (#6709), [AIRFLOW-5583] Extend the DAG Details page to display the start_date / end_date (#6235), [AIRFLOW-6250] Ensure on_failure_callback always has a populated context (#6812), [AIRFLOW-6222] http hook logs response body for any failure (#6779), [AIRFLOW-6260] Drive _cmd config option by env var (AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_CMD for example) (#6801), [AIRFLOW-6168] Allow proxy_fix middleware of webserver to be configurable (#6723), [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. 
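The stat_name_handler mentioned above takes a metric name and returns a (possibly rewritten) name before the metric is emitted to StatsD. A minimal sketch, where the module name, function body, and "myco.airflow." prefix are all illustrative assumptions rather than Airflow defaults:

```python
# my_stats.py -- hypothetical module holding the handler.
def stat_name_handler(stat_name: str) -> str:
    """Rewrite a StatsD metric name before Airflow emits it."""
    # Drop characters StatsD backends commonly reject, then add a prefix
    # so all Airflow metrics group together in dashboards.
    safe = "".join(c for c in stat_name if c.isalnum() or c in "._-")
    return f"myco.airflow.{safe}"
```

It would then be referenced from airflow.cfg as stat_name_handler = my_stats.stat_name_handler under the [scheduler] section.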
https://markupsafe.palletsprojects.com/en/2.1.x/changes/#version-2-1-0, ImportError: cannot import name 'soft_unicode' from 'markupsafe', https://hynek.me/articles/semver-will-not-save-you/, TST: devdeps broken due to jinja2 and markupsafe conflicts, "soft_unicode" is not supported in latest markupsafe, ImportError: cannot import name 'json' from 'itsdangerous' (/usr/local/lib/python3.8/dist-packages/itsdangerous/__init__.py), Minor modifs in environment and .gitignore, from SORTEE-Github-Hackathon/RCO-fixes-brackets, [BUG] OpenSearch Benchmark failing to run after installation because of Markupsafe versioning, onionshare-cli depends on flask, which depends on jinja2, which depends on soft_unicode module from markupsafe, which was removed as of version 2.1.0, az alias extension broken due to MarkUpSafe version increase, cannot import name 'soft_unicode' from 'markupsafe', [SPARK-38279][TESTS][3.2] Pin MarkupSafe to 2.0.1 fix linter failure, Multiple problems with the SQS implementation and documentation, Got dependency issue with spacy and markupsafe, Fixed import error of flask caused by markupsafe. User.superuser will default to False, which means that this privilege will have to be granted manually to any users that may require it. But that didn't solve the issue; looking into this. The new sync_parallelism config option will control how many processes CeleryExecutor will use to fetch celery task state in parallel. This release may contain changes that will require changes to your configuration, DAG files or other integrations. Deprecation yes, removal never: it just hurts too many people/users/developers. Users using Application Default Credentials (ADC) need not take any action. 
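The sync_parallelism option mentioned above lives in the [celery] section of airflow.cfg; the value below is only an example of the shape, not a recommendation:

```
[celery]
# How many processes CeleryExecutor uses to fetch celery task state in
# parallel; 0 lets Airflow derive the count from the machine's CPU cores.
sync_parallelism = 16
```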
[], [AIRFLOW-1582] Improve logging within Airflow, [AIRFLOW-1476] add INSTALL instruction for source releases, [AIRFLOW-XXX] Save username and password in airflow-pr, [AIRFLOW-1522] Increase text size for var field in variables for MySQL, [AIRFLOW-950] Missing AWS integrations on documentation::integrations, [AIRFLOW-1573] Remove thrift < 0.10.0 requirement, [AIRFLOW-1584] Remove insecure /headers endpoint, [AIRFLOW-1586] Add mapping for date type to mysql_to_gcs operator, [AIRFLOW-1579] Adds support for jagged rows in Bigquery hook for BQ load jobs, [AIRFLOW-1577] Add token support to DatabricksHook, [AIRFLOW-1580] Error in string formatting, [AIRFLOW-1567] Updated docs for Google ML Engine operators/hooks, [AIRFLOW-1574] add to attribute to templated vars of email operator, [AIRFLOW-1572] add carbonite to company list, [AIRFLOW-1493][AIRFLOW-XXXX][WIP] fixed dumb thing, [AIRFLOW-1567][Airflow-1567] Renamed cloudml hook and operator to mlengine, [AIRFLOW-1568] Add datastore export/import operators, [AIRFLOW-1564] Use Jinja2 to render logging filename, [AIRFLOW-1562] Spark-sql logging contains deadlock, [AIRFLOW-1556][Airflow 1556] Add support for SQL parameters in BigQueryBaseCursor, [AIRFLOW-108] Add CreditCards.com to companies list, [AIRFLOW-1541] Add channel to template fields of slack_operator, [AIRFLOW-1535] Add service account/scopes in dataproc, [AIRFLOW-1384] Add to README.md CaDC/ARGO, [AIRFLOW-1546] add Zymergen 80to org list in README, [AIRFLOW-1545] Add Nextdoor to companies list, [AIRFLOW-1544] Add DataFox to companies list, [AIRFLOW-1529] Add logic supporting quoted newlines in Google BigQuery load jobs, [AIRFLOW-1521] Fix template rendering for BigqueryTableDeleteOperator, [AIRFLOW-1324] Generalize Druid operator and hook, [AIRFLOW-1516] Fix error handling getting fernet, [AIRFLOW-1420][AIRFLOW-1473] Fix deadlock check, [AIRFLOW-1495] Fix migration on index on job_id, [AIRFLOW-1483] Making page size consistent in list, [AIRFLOW-1495] Add 
TaskInstance index on job_id, [AIRFLOW-855] Replace PickleType with LargeBinary in XCom, [AIRFLOW-1505] Document when Jinja substitution occurs, [AIRFLOW-1239] Fix unicode error for logs in base_task_runner, [AIRFLOW-1507] Template parameters in file_to_gcs operator, [AIRFLOW-1385] Make Airflow task logging configurable, [AIRFLOW-940] Handle error on variable decrypt, [AIRFLOW-1492] Add gauge for task successes/failures, [AIRFLOW-1443] Update Airflow configuration documentation, [AIRFLOW-1486] Unexpected S3 writing log error, [AIRFLOW-1487] Added links to all companies officially using Airflow, [AIRFLOW-1489] Fix typo in BigQueryCheckOperator, [AIRFLOW-1349] Fix backfill to respect limits, [AIRFLOW-1478] Chart owner column should be sortable, [AIRFLOW-1397][AIRFLOW-1] No Last Run column data displayed in airflow UI 1.8.1, [AIRFLOW-1474] Add dag_id regex feature for airflow clear command, [AIRFLOW-1445] Changing HivePartitionSensor UI color to lighter shade, [AIRFLOW-1359] Use default_args in Cloud ML eval, [AIRFLOW-1389] Support createDisposition in BigQueryOperator, [AIRFLOW-1349] Refactor BackfillJob _execute, [AIRFLOW-1459] Fixed broken integration .rst formatting, [AIRFLOW-1448] Revert Fix cli reading logfile in memory, [AIRFLOW-1398] Allow ExternalTaskSensor to wait on multiple runs of a task, [AIRFLOW-1399] Fix cli reading logfile in memory, [AIRFLOW-1442] Remove extra space from ignore_all_deps generated command, [AIRFLOW-1438] Change batch size per query in scheduler, [AIRFLOW-1439] Add max billing tier for the BQ Hook and Operator, [AIRFLOW-1437] Modify BigQueryTableDeleteOperator, [Airflow 1332] Split logs based on try number, [AIRFLOW-1385] Create abstraction for Airflow task logging, [AIRFLOW-756][AIRFLOW-751] Replace ssh hook, operator & sftp operator with paramiko based, [AIRFLOW-1393][[AIRFLOW-1393] Enable Py3 tests in contrib/spark_submit_hook[, [AIRFLOW-1345] Dont expire TIs on each scheduler loop, [AIRFLOW-1059] Reset orphaned tasks in batch for 
scheduler, [AIRFLOW-1255] Fix SparkSubmitHook output deadlock, [AIRFLOW-1359] Add Google CloudML utils for model evaluation, [AIRFLOW-1247] Fix ignore all dependencies argument ignored, [AIRFLOW-1401] Standardize cloud ml operator arguments, [AIRFLOW-1394] Add quote_character param to GCS hook and operator, [AIRFLOW-1402] Cleanup SafeConfigParser DeprecationWarning, [AIRFLOW-1326][AIRFLOW-1184] Dont split argument array its already an array. (#5411), [AIRFLOW-4793] Add signature_name to mlengine operator (#5417), [AIRFLOW-3211] Reattach to GCP Dataproc jobs upon Airflow restart (#4083), [AIRFLOW-4750] Log identified zombie task instances (#5389), [AIRFLOW-3870] STFPOperator: Update log level and return value (#4355), [AIRFLOW-4759] Batch queries in set_state API. In the PubSubHook.create_subscription hook method, the parameter subscription_project is replaced by subscription_project_id. I am trying to build a Docker container with Airflow and Postgres, but I am getting many errors during the build. The scheduler's activity status can be determined by graphing and alerting, using a rate of change of the counter. Previously, there was an empty class airflow.models.base.Operator for type hinting. These changes are not backward compatible. @davidism, thanks for the tip on pip-tool; I'll be sure to try it out. Previously, new DAGs would be scheduled immediately. The old name will continue to work but will issue warnings. Similarly, if you were using the DagBag().store_serialized_dags property, change it to DagBag().read_dags_from_db. Adding markupsafe==2.0.1 to my Python env yaml file worked for me too! We removed the airflow.AirflowMacroPlugin class. 
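The workaround reported in the thread above is to pin the affected packages below the releases that removed the old imports; a requirements-file sketch (the itsdangerous pin is an assumption, based on the similar 'cannot import name json' error quoted earlier):

```
# Pin below the 2.1.x releases that dropped soft_unicode / itsdangerous.json.
markupsafe==2.0.1
itsdangerous==2.0.1
```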
Tasks not starting although dependencies are met due to stricter pool checking, Less forgiving scheduler on dynamic start_date, Faulty DAGs do not show an error in the Web UI, Airflow Context variable are passed to Hive config if conf is specified, PR to replace chardet with charset-normalizer, https://airflow.apache.org/docs/apache-airflow/stable/howto/custom-operator.html, https://github.com/apache/airflow/pull/11993, https://github.com/apache/airflow/pull/6317, https://cloud.google.com/compute/docs/disks/performance, https://community.atlassian.com/t5/Stride-articles/Stride-and-Hipchat-Cloud-have-reached-End-of-Life-updated/ba-p/940248, https://airflow.apache.org/docs/1.10.13/howto/custom-operator.html, https://cloud.google.com/apis/docs/client-libraries-explained, https://github.com/apache/airflow/blob/1.10.0/airflow/contrib/auth/backends/ldap_auth.py, https://airflow.apache.org/timezone.html#default-time-zone, airflow/config_templates/airflow_logging_settings.py, the Hive docs on Configuration Properties, https://github.com/apache/airflow/pull/1285. The functions redirect_stderr and redirect_stdout from the airflow.utils.log.logging_mixin module have been removed; use contextlib.redirect_stdout and contextlib.redirect_stderr from the standard library instead. Note the use of set and datetime types, which are not JSON-serializable. Once you modify your config file, run airflow db init to generate new tables for RBAC support (these tables will have the prefix ab_). 
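The note above about set and datetime types not being JSON-serializable can be seen directly with the standard library; the fallback encoder below is an illustrative workaround, not part of Airflow:

```python
import json
from datetime import datetime

payload = {"when": datetime(2020, 1, 1), "tags": {"etl", "daily"}}

def fallback(obj):
    """Illustrative json.dumps fallback for non-serializable types."""
    if isinstance(obj, datetime):
        return obj.isoformat()
    if isinstance(obj, set):
        return sorted(obj)  # sets have no JSON equivalent; emit a list
    raise TypeError(f"Not JSON-serializable: {type(obj).__name__}")

# json.dumps(payload) alone raises TypeError for both fields.
print(json.dumps(payload, default=fallback))
```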
BugFix: Tasks with depends_on_past or task_concurrency are stuck (#12663), Fix issue with empty Resources in executor_config (#12633), Fix: Deprecated config force_log_out_after was not used (#12661), Fix empty asctime field in JSON formatted logs (#10515), [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY (#3651), [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729), [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738), Bugfix: Unable to import Airflow plugins on Python 3.8 (#12859), Fix setup.py missing comma in setup_requires (#12880), Dont emit first_task_scheduling_delay metric for only-once dags (#12835), Update setup.py to get non-conflicting set of dependencies (#12636), Rename [scheduler] max_threads to [scheduler] parsing_processes (#12605), Add metric for scheduling delay between first run task & expected start time (#9544), Add new-style 2.0 command names for Airflow 1.10.x (#12725), Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802), Dont let webserver run with dangerous config (#12747), Replace pkg_resources with importlib.metadata to avoid VersionConflict errors (#12694), Clarified information about supported Databases. You can specify that custom job waiters will be used to monitor a batch job. A logger is the entry point into the logging system. The text was updated successfully, but these errors were encountered: Looks like the issue is due to an upgrade to MarkupSafe 2.1.0, where they have removed soft_unicode. pip install --user "aws-sam-cli==1.37.0". Looks like a breaking change in markupsafe, and jinja not specifying an upper version bound: pallets/markupsafe#286. We moved operators and hooks that integrate with third party services to the airflow.providers package. There is no need to explicitly provide or not provide the context anymore. New replacement constructor kwarg: previous_objects: Optional[Set[str]]. 
Here, we're going to use Jinja2, a standalone Python template engine, to replace the template tag {% include overfitting.html %} with the contents of the Jupyter notebook HTML file. The default value is prefork, while choices include prefork (default), eventlet, gevent or solo. Validation of dataset_reference is done using Dataset.from_api_repr. If you pin at least one dependency, you have to pin the whole tree; otherwise any library could update at any time while another remains pinned. It is the maximum number of task instances allowed to run concurrently in each DAG. Let's start with version fixing and then move to the syntax change. The default connection is now aws_default instead of s3_default, and the return type of objects returned by get_bucket is now boto3.s3.Bucket. An error is raised if parameters were passed both in dataset_reference and as arguments to the method. CONTEXT_MANAGER_DAG was removed from settings. 
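The include-substitution described above can be sketched with Jinja2 itself; the page and notebook filenames are stand-ins, and DictLoader stands in for reading the files from disk:

```python
from jinja2 import Environment, DictLoader

# Stand-in templates: in the real workflow, overfitting.html would be the
# nbconvert-exported notebook HTML read from disk.
templates = {
    "post.html": "<article>{% include 'overfitting.html' %}</article>",
    "overfitting.html": "<p>notebook output</p>",
}

env = Environment(loader=DictLoader(templates))
# Rendering post.html splices the included file's contents into place.
html = env.get_template("post.html").render()
print(html)
```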
[AIRFLOW-2520] CLI - make backfill less verbose, [AIRFLOW-2107] add time_partitioning to run_query on BigQueryBaseCursor, [AIRFLOW-1057][AIRFLOW-1380][AIRFLOW-2362][2362] AIRFLOW Update DockerOperator to new API, [AIRFLOW-2415] Make Airflow DAG templating render numbers, [AIRFLOW-2473] Fix wrong skip condition for TransferTests, [AIRFLOW-2472] Implement MySqlHook.bulk_dump, [AIRFLOW-2419] Use default view for subdag operator, [AIRFLOW-2498] Fix Unexpected argument in SFTP Sensor, [AIRFLOW-2509] Separate config docs into how-to guides, [AIRFLOW-2429] Fix dag, example_dags, executors flake8 error, [AIRFLOW-2502] Change Single triple quotes to double for docstrings, [AIRFLOW-2503] Fix broken links in CONTRIBUTING.md, [AIRFLOW-2501] Refer to devel instructions in docs contrib guide, [AIRFLOW-2429] Fix contrib folders flake8 errors, [AIRFLOW-2471] Fix HiveCliHook.load_df to use unused parameters, [AIRFLOW-2429] Fix api, bin, config_templates folders flake8 error, [AIRFLOW-2493] Mark template_fields of all Operators in the API document as templated, [AIRFLOW-2489] Update FlaskAppBuilder to 1.11.1, [AIRFLOW-2448] Enhance HiveCliHook.load_df to work with datetime, [AIRFLOW-2487] Enhance druid ingestion hook, [AIRFLOW-2397] Support affinity policies for Kubernetes executor/operator, [AIRFLOW-2482] Add test for rewrite method in GCS Hook, [AIRFLOW-2485] Fix Incorrect logging for Qubole Sensor, [AIRFLOW-2486] Remove unnecessary slash after port, [AIRFLOW-2429] Make Airflow flake8 compliant, [AIRFLOW-2491] Resolve flask version conflict, [AIRFLOW-2484] Remove duplicate key in MySQL to GCS Op, [AIRFLOW-2458] Add cassandra-to-gcs operator, [AIRFLOW-2477] Improve time units for task duration and landing times charts for RBAC UI, [AIRFLOW-2474] Only import snakebite if using py2, [AIRFLOW-48] Parse connection uri querystring, [AIRFLOW-2467][AIRFLOW-2] Update import direct warn message to use the module name, [AIRFLOW-2452] Document field_dict must be OrderedDict, [AIRFLOW-2465] 
Fix wrong module names in the doc, [AIRFLOW-1929] Modifying TriggerDagRunOperator to accept execution_date, [AIRFLOW-2460] Users can now use volume mounts and volumes, [AIRFLOW-2110][AIRFLOW-2122] Enhance Http Hook, [AIRFLOW-2435] Add launch_type to ECSOperator to allow FARGATE, [AIRFLOW-2451] Remove extra slash (/) char when using wildcard in gcs_to_gcs operator, [AIRFLOW-2461] Add support for cluster scaling on dataproc operator, [AIRFLOW-2430] Extend query batching to additional slow queries, [AIRFLOW-2453] Add default nil value for kubernetes/git_subpath, [AIRFLOW-2396] Add support for resources in kubernetes operator, [AIRFLOW-2169] Encode binary data with base64 before importing to BigQuery, [AIRFLOW-2457] Update FAB version requirement, [AIRFLOW-2454][Airflow 2454] Support imagePullPolicy for k8s, [AIRFLOW-2450] update supported k8s versions to 1.9 and 1.10, [AIRFLOW-2333] Add Segment Hook and TrackEventOperator, [AIRFLOW-2442][AIRFLOW-2] Airflow run command leaves database connections open, [AIRFLOW-2016] assign template_fields for Dataproc Workflow Template sub-classes, not base class, [AIRFLOW-2446] Add S3ToRedshiftTransfer into the Integration doc, [AIRFLOW-2449] Fix operators.py to run all test cases, [AIRFLOW-2424] Add dagrun status endpoint and increased k8s test coverage, [AIRFLOW-2441] Fix bugs in HiveCliHook.load_df, [AIRFLOW-2358][AIRFLOW-201804] Make the Kubernetes example optional, [AIRFLOW-2436] Remove cli_logger in initdb, [AIRFLOW-2444] Remove unused option(include_adhoc) in cli backfill command, [AIRFLOW-2447] Fix TestHiveMetastoreHook to run all cases, [AIRFLOW-2445] Allow templating in kubernetes operator, [AIRFLOW-2086][AIRFLOW-2393] Customize default dagrun number in tree view, [AIRFLOW-2437] Add PubNub to list of current Airflow users, [AIRFLOW-XXX] Add Quantopian to list of Airflow users, [AIRFLOW-1978] Add WinRM windows operator and hook, [AIRFLOW-2427] Add tests to named hive sensor, [AIRFLOW-2412] Fix HiveCliHook.load_file to 
address HIVE-10541, [AIRFLOW-2431] Add the navigation bar color parameter for RBAC UI, [AIRFLOW-2407] Resolve Python undefined names, [AIRFLOW-1952] Add the navigation bar color parameter, [AIRFLOW-2222] Implement GoogleCloudStorageHook.rewrite, [AIRFLOW-2426] Add Google Cloud Storage Hook tests, [AIRFLOW-2417] Wait for pod is not running to end task, [AIRFLOW-1914] Add other charset support to email utils, [AIRFLOW-XXX] Update README.md with Craig@Work, [AIRFLOW-2313] Add TTL parameters for Dataproc, [AIRFLOW-2411] add dataproc_jars to templated_fields, [AIRFLOW-XXX] Add Reddit to Airflow users, [AIRFLOW-XXX] Fix wrong table header in scheduler.rst, [AIRFLOW-2409] Supply password as a parameter, [AIRFLOW-2410][AIRFLOW-75] Set the timezone in the RBAC Web UI, [AIRFLOW-2394] default cmds and arguments in kubernetes operator, [AIRFLOW-2406] Add Apache2 License Shield to Readme, [AIRFLOW-2404] Add additional documentation for unqueued task, [AIRFLOW-2400] Add Ability to set Environment Variables for K8s, [AIRFLOW-XXX] Add Twine Labs as an Airflow user, [AIRFLOW-1853] Show only the desired number of runs in tree view, [AIRFLOW-2401] Document the use of variables in Jinja template, [AIRFLOW-2398] Add BounceX to list of current Airflow users, [AIRFLOW-2363] Fix return type bug in TaskHandler, [AIRFLOW-2389] Create a pinot db api hook, [AIRFLOW-2390] Resolve FlaskWTFDeprecationWarning, [AIRFLOW-1960] Add support for secrets in kubernetes operator, [AIRFLOW-1313] Add vertica_to_mysql operator, [AIRFLOW-1575] Add AWS Kinesis Firehose Hook for inserting batch records, [AIRFLOW-2266][AIRFLOW-2343] Remove google-cloud-dataflow dependency, [AIRFLOW-2370] Implement use_random_password in create_user, [AIRFLOW-2348] Strip path prefix from the destination_object when source_object contains a wildcard[], [AIRFLOW-2381] Fix the flaky ApiPasswordTests test, [AIRFLOW-2378] Add Groupon to list of current users, [AIRFLOW-2382] Fix wrong description for delimiter. 