The Airflow BashOperator is a common operator used to execute bash commands as part of a data pipeline.
```python
from airflow.operators.bash import BashOperator

execute_script = BashOperator(
    task_id="execute_script",
    bash_command="python /path/to/script.py",
)
```
The BashOperator is very general: it can run any bash command, and Dagster offers richer integrations for many common BashOperator use cases. We'll explain how to perform a 1-1 migration of the BashOperator to execute a bash command in Dagster, and how to use the dagster-airlift library to proxy the execution of the original task to Dagster. We'll also provide a reference for richer integrations in Dagster for common BashOperator use cases.
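At its core, a 1-1 migration means running the same shell command in a subprocess from Dagster-managed compute and failing the run on a nonzero exit code, just as the BashOperator does in Airflow. A minimal standard-library sketch of that behavior (the `run_bash_command` helper is a hypothetical illustration, not a Dagster API; in Dagster its body would live inside an asset or op):

```python
import subprocess

def run_bash_command(bash_command: str) -> str:
    """Run a shell command the way BashOperator does: via bash -c,
    raising an exception on a nonzero exit code (a failed task)."""
    result = subprocess.run(
        ["bash", "-c", bash_command],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError on failure
    )
    return result.stdout
```

In practice you would let Dagster's process-launching machinery handle this for you rather than shelling out by hand, but the failure semantics above are the behavior you need to preserve.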
First, you'll need to ensure that the bash command you're running is available in both your Airflow and Dagster deployments. What this entails varies depending on the command. For example, if you're running a Python script, it's as simple as ensuring the script exists in a shared location accessible to both Airflow and Dagster, and that all necessary environment variables are set in both environments.
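One way to sanity-check this parity before cutting over is a small helper that verifies the script path exists and the required environment variables are set. `check_environment` here is a hypothetical illustration, not part of Airflow, Dagster, or dagster-airlift:

```python
import os
from pathlib import Path

def check_environment(script_path, required_env_vars):
    """Return a list of problems that would prevent the bash command
    from running in the current environment (empty list = all good)."""
    problems = []
    if not Path(script_path).exists():
        problems.append(f"missing script: {script_path}")
    for var in required_env_vars:
        if var not in os.environ:
            problems.append(f"unset env var: {var}")
    return problems
```

Running a check like this in both the Airflow and Dagster environments catches drift (a missing mount, an unset credential) before it surfaces as a confusing runtime failure.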
As mentioned above, you can use the PipesSubprocessClient to run a Python script in a subprocess. You can also modify the script to send additional information and logging back to Dagster. See the Dagster Pipes tutorial for more information.
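For instance, the script itself can open a Pipes connection and stream logs and metadata back to the orchestrating Dagster process. A sketch, assuming the `dagster-pipes` package is installed in the script's environment (the work and the metadata key are illustrative):

```python
from dagster_pipes import open_dagster_pipes

if __name__ == "__main__":
    # Opens a channel back to the Dagster process that launched this script.
    with open_dagster_pipes() as pipes:
        pipes.log.info("starting work")
        # ... do the actual work of the script here ...
        # Report structured metadata back to Dagster for the materialized asset.
        pipes.report_asset_materialization(metadata={"row_count": 100})
```

When the script is launched by the PipesSubprocessClient, the logs and metadata it reports show up in the Dagster UI alongside the run; when launched outside Dagster, consult the Pipes docs for how the context behaves.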