Airflow on_failure_callback & SlackAPIPostOperator
I am trying to post to Slack using the on_failure_callback. I have the procedure below, and my default DAG args include
`'on_failure_callback': notify_slack_failure`
When my task fails, the callback creates the output files in /tmp, but the Slack message is never posted. When I test the operator from the command line, it posts to Slack fine. Any thoughts on what I'm missing?
```python
import os

from airflow.operators.slack_operator import SlackAPIPostOperator


def notify_slack_failure(context):
    """
    Callback that posts to Slack if a failure is detected in the workflow.
    :return: operator.execute
    """
    cmd = "echo '" + getSlackToken() + "' > /tmp/a.out"
    os.system("touch /tmp/b.out")
    os.system(cmd)
    text_message = '333'
    # text_message = str(context['task_instance'])
    operator = SlackAPIPostOperator(
        task_id='failure',
        text=text_message,
        token=getSlackToken(),
        channel=SLACK_CHANNEL,
        username=SLACK_USER
    )
    os.system("touch /tmp/e1.out")
    return operator.execute(context=context)
```
Solution 1:[1]
George - welcome to SO! Try running this code and see whether Airflow is able to post to Slack, and whether the code writing to the tmp files was interfering with it.
```python
def notify_slack_failure(contextDictionary, **kwargs):
    """
    Callback that posts to Slack if a failure is detected in the workflow.
    :return: operator.execute
    """
    text_message = '333'
    # text_message = str(contextDictionary['task_instance'])
    operator = SlackAPIPostOperator(
        task_id='failure',
        token=getSlackToken(),
        text=text_message,
        channel=SLACK_CHANNEL,
        username=SLACK_USER,
    )
    return operator.execute
```
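For context, a minimal sketch of how such a callback is wired through `default_args` so Airflow invokes it with the task context on failure. No Airflow install is needed here: the stub callback and the task name are hypothetical stand-ins, and the real callback would build and execute a SlackAPIPostOperator as shown above.

```python
def notify_slack_failure(context):
    # Stand-in for the real Slack-posting callback; it just reports
    # which task instance failed, so the wiring can be exercised.
    return "failure in {}".format(context.get('task_instance', 'unknown'))


# Airflow copies default_args onto every task in the DAG; any task that
# fails then has its on_failure_callback called with the task context.
default_args = {
    'owner': 'airflow',
    'retries': 0,
    'on_failure_callback': notify_slack_failure,
}

# Simulate what Airflow does when a task fails: call the callback
# with a context dict describing the failed task instance.
result = default_args['on_failure_callback']({'task_instance': 'extract_task'})
print(result)  # failure in extract_task
```

The key point is that the callback must be passed as a callable (no parentheses) in `default_args`; Airflow supplies the context dict itself at failure time.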
Taken from SO Slack Airflow answer
Solution 2:[2]
In my command-line session I had the http_proxy variables defined, but they were not set for the Airflow daemon processes, so the callback could not reach Slack.
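To illustrate the fix: daemonized processes do not inherit a login shell's environment, so the proxy variables have to be set wherever the daemon is launched. A sketch, assuming a systemd-managed scheduler (the unit path and proxy host are hypothetical):

```shell
# Option 1: for a systemd-managed Airflow, set the variables in the unit file,
# e.g. /etc/systemd/system/airflow-scheduler.service (assumed path):
#
#   [Service]
#   Environment="http_proxy=http://proxy.example.com:3128"
#   Environment="https_proxy=http://proxy.example.com:3128"
#
# then: systemctl daemon-reload && systemctl restart airflow-scheduler

# Option 2: export the variables in the same shell that starts the daemon:
export http_proxy=http://proxy.example.com:3128
export https_proxy=http://proxy.example.com:3128
airflow scheduler -D
```

Either way, the point is that the environment the callback runs in (the scheduler/worker daemon) must have the proxy variables, not just your interactive shell.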
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Zack |
| Solution 2 | StaceyGirl |
