Apache Airflow™ is an open-source platform for orchestrating batch workflows. In Airflow, pipelines are defined in Python code as directed acyclic graphs (DAGs), meaning you can generate workflows dynamically and connect them with virtually any technology, either through ready-made packages from third-party providers or your own extensions. Airflow's modular architecture and built-in message queue make it highly scalable. Airflow's user interface provides both overviews and in-depth views of pipelines and tasks.
You can deploy Apache Airflow on your Nebius AI Compute Cloud virtual machines using this Marketplace product.
Warning
If you are going to use this product in production, we recommend configuring it according to the Airflow recommendations.
- Click the button in this card to go to VM creation. The image will be selected automatically under Image/boot disk selection.
- Under Network settings, enable a public IP address for the VM.
- Under Access, paste the public key from the pair into the SSH key field.
- Create the VM.
- To access the Airflow UI, go to http://<public_IP_address_of_VM> in your web browser and use the following credentials:
  - Username: test_admin
  - Password: the ID of your VM. See the guide on how to get information about a VM.
After logging in, reset the admin’s password in the UI: under the user picture, click Your profile → Reset my password.
Typical uses of this product include:
- Developing, maintaining, and scheduling workflows as code.
- Connecting workflows with other technologies.
- Monitoring workflows and tasks.
Nebius AI does not provide technical support for this product. If you have any issues, please refer to the developer's information resources.