Can I mark an Airflow SSHOperator task that hits its execution_timeout as Success?

I have tasks in a DAG that execute bash scripts over SSH on more than one machine, and I have configured execution_timeout for these tasks as well. But when the execution_timeout is hit, the current task is marked failed and the next tasks are marked upstream_failed. So one timed-out task causes the other tasks to fail as well.
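To make the setup concrete, here is a stripped-down illustration of the kind of chain I mean (the hook names, commands and task ids here are just placeholders for my real ones):

from datetime import timedelta
from airflow.contrib.operators.ssh_operator import SSHOperator

run_step = SSHOperator(
    task_id='run_step',
    ssh_hook=sshHook5,                     # hook to the first machine
    command="long_running_script.sh",      # placeholder command
    execution_timeout=timedelta(seconds=30),
    dag=dag
    )

next_step = SSHOperator(
    task_id='next_step',
    ssh_hook=sshHook6,                     # hook to the second machine (placeholder)
    command="follow_up_script.sh",         # placeholder command
    dag=dag
    )

run_step >> next_step                      # timeout on run_step -> next_step becomes upstream_failed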

I have already used AirflowTaskTimeout with PythonOperator and it works just fine; I have pasted the example below. But with SSHOperator I can't call a Python function to execute bash scripts on a remote machine and catch exceptions there.

import time
from datetime import timedelta

# Airflow 1.x import paths
from airflow.exceptions import AirflowTaskTimeout
from airflow.operators.python_operator import PythonOperator

def func():
    try:
        time.sleep(40)                     # stand-in for the real work
    except AirflowTaskTimeout:
        return 'success'                   # swallow the timeout so the task is marked successful

t1 = PythonOperator(
    task_id="task1",
    python_callable=func,
    execution_timeout=timedelta(seconds=30),
    dag=dag                                # dag is defined elsewhere
    )

from airflow.contrib.operators.ssh_operator import SSHOperator  # Airflow 1.x contrib path

t3 = SSHOperator(
    task_id='task3',
    ssh_hook=sshHook5,                     # SSHHook to the remote machine, defined elsewhere
    command="sleep 40",
    execution_timeout=timedelta(seconds=30),
    dag=dag
    )

I want to know: is there a workaround (in the case of SSHOperator/BashOperator) that I can use to mark a timed-out task as successful?
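The closest idea I have is to run the remote command from a PythonOperator through the SSH hook itself, so the AirflowTaskTimeout can be caught the same way as in task1, but I am not sure this is a proper approach. A rough, untested sketch of that idea (it assumes sshHook5 is the contrib SSHHook used above, that its get_conn() returns a paramiko SSHClient, and the t3_alt/task3_via_hook names are just placeholders):

from datetime import timedelta
from airflow.exceptions import AirflowTaskTimeout
from airflow.operators.python_operator import PythonOperator

def run_remote_command():
    try:
        client = sshHook5.get_conn()                         # paramiko SSHClient from the hook
        stdin, stdout, stderr = client.exec_command("sleep 40")
        exit_status = stdout.channel.recv_exit_status()      # block until the remote command finishes
        if exit_status != 0:
            raise Exception("remote command exited with status %s" % exit_status)
    except AirflowTaskTimeout:
        return 'success'                                     # treat the timeout as success, like in task1

t3_alt = PythonOperator(
    task_id='task3_via_hook',
    python_callable=run_remote_command,
    execution_timeout=timedelta(seconds=30),
    dag=dag
    )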


