Automate AWS Connect Contact Flows between AWS accounts?
Has anyone had any experience with, or been able to automate, contact flow creation between AWS Connect accounts? That is, take a contact flow in AWS account A and have it created in AWS account B. We've been doing this manually via the GUI: exporting the contact flow, importing it into the other AWS account, and then updating it for the Lambdas, Lex bots or other flows that may be referenced in it.
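For reference, the manual steps map roughly onto these CLI calls (just a sketch; the instance IDs, flow ID, profile names and flow name below are placeholders, and the exported content still needs its referenced IDs/ARNs fixed up before the create works in the other account):
# Export the flow content from the source account
aws connect describe-contact-flow --instance-id <source-instance-id> --contact-flow-id <flow-id> \
  --profile source-profile --region eu-west-2 | jq -r '.ContactFlow.Content' > flow.json
# Re-create it in the target account from that content
aws connect create-contact-flow --instance-id <target-instance-id> --name "my-flow" --type CONTACT_FLOW \
  --content file://flow.json --profile target-profile --region eu-west-2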
I created a bash script which does a bit of terraforming as well, and I've found that while it works for some simple contact flows, it doesn't work for others that reference other contact flows within them:
DEV_INSTANCE_ID=
STAGING_INSTANCE_ID=
PROD_INSTANCE_ID=
LEX_D=awln_lex_d
LEX_S=awln_lex_s
LEX_P=awln_lex_p
REGION=eu-west-2
DIR=
ACCOUNT_D=d-connect
ACCOUNT_S=s-connect
ACCOUNT_P=p-connect
ACCOUNT_ID_D=
ACCOUNT_ID_S=
ACCOUNT_ID_P=
#Questions to be asked for contact flow
read -p "Please state what contact flow (exact name) you would like to export from dev (d-connect): " export
read -p "Which account would you like this contact flow created in? (d-connect or s-connect): " account
read -p "Please give a description of the contact flow: " description
read -p "What type of contact flow is this (CONTACT_FLOW, CUSTOMER_QUEUE, CUSTOMER_HOLD, CUSTOMER_WHISPER, AGENT_HOLD, AGENT_WHISPER, OUTBOUND_WHISPER, AGENT_TRANSFER, QUEUE_TRANSFER): " contact_flow
# aws cli commands
cli_commands () {
CLI=$(aws connect list-contact-flows --instance-id $DEV_INSTANCE_ID --profile $ACCOUNT_D --region $REGION | grep -C 2 $export | grep Id | awk '{print $2 }' | tr -d \",)
aws connect describe-contact-flow --instance-id $DEV_INSTANCE_ID --contact-flow-id $CLI --profile $ACCOUNT_D --region $REGION | jq '.ContactFlow.Content | fromjson' > $export.json
}
main () {
if [ "$account" == "$ACCOUNT_S" ];
then
echo "cloning repo"
git clone
cd /contact-flow-automation/$account/
mv $DIR/$export.json $DIR/contact-flow-automation/$account/
sed -i "s/$ACCOUNT_ID_D/$ACCOUNT_ID_S/g" $DIR/contact-flow-automation/$account/$export.json
sed -i "s/$LEX_D/$LEX_S/g" $DIR/contact-flow-automation/$account/$export.json
sed -i "s/$DEV_INSTANCE_ID/$STAGING_INSTANCE_ID/g" $DIR/contact-flow-automation/$account/$export.json
cat << EOF >> $DIR/contact-flow-automation/$account/main.tf
resource "aws_connect_contact_flow" "$export" {
instance_id = "$STAGING_INSTANCE_ID"
name = "$export"
description = "$description"
type = "$contact_flow"
filename = "$export.json"
content_hash = filebase64sha256("$export.json")
}
EOF
terraform init
echo yes | terraform apply --var-file=terraform.tfvars
git add .
git commit -m "automation update for contact flow $export to $account"
git push origin main
elif [ "$account" == "$ACCOUNT_P" ];
then
echo "cloning repo"
git clone
cd contact-flow-automation/$account/
mv $DIR/$export.json $DIR/contact-flow-automation/$account/
sed -i "s/$ACCOUNT_ID_D/$ACCOUNT_ID_P/g" $DIR/contact-flow-automation/$account/$export.json
sed -i "s/$LEX_D/$LEX_P/g" $DIR/contact-flow-automation/$account/$export.json
sed -i "s/$DEV_INSTANCE_ID/$PROD_INSTANCE_ID/g" $DIR/contact-flow-automation/$account/$export.json
cat << EOF >> $DIR/contact-flow-automation/$account/main.tf
resource "aws_connect_contact_flow" "$export" {
instance_id = "$PROD_INSTANCE_ID"
name = "$export"
description = "$description"
type = "$contact_flow"
filename = "$export.json"
content_hash = filebase64sha256("$export.json")
}
EOF
terraform init
echo yes | terraform apply --var-file=terraform.tfvars
git add .
git commit -m "automation update for contact flow $export to $account"
git push origin main
fi
}
cli_commands
main
echo "You need to save/publish the $export contact flow"
So the issue is that Terraform sometimes fails because a particular flow contains a reference to another flow, and the ID in the .json is incorrect for the target account. Just wondering if anyone has been able to automate this in any way?
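One quick way to see which references still point at the source account is to list every Connect ARN embedded in the export (a rough jq sketch; it assumes the export was saved as flow.json):
# List every Amazon Connect ARN embedded in the exported flow content
jq -r '.. | strings | select(test("arn:aws:connect"))' flow.json | sort -u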
Thanks
Solution 1:[1]
I finally got a script together to achieve this. It looks at the other contact flows referenced within the flow you're trying to export/import and gets the IDs for them, as well as for any queues.
#!/bin/bash
DEV_INSTANCE_ID=
STAGING_INSTANCE_ID=
PROD_INSTANCE_ID=
LEX_D=awln_lex_d
LEX_S=awln_lex_s
LEX_P=awln_lex_p
REGION=eu-west-2
DIR=
ACCOUNT_D=d-connect
ACCOUNT_S=s-connect
ACCOUNT_P=p-connect
ACCOUNT_ID_D=
ACCOUNT_ID_S=
ACCOUNT_ID_P=
#Questions to be asked for contact flow
read -p "Please state what contact flow (exact name) you would like to
export from dev (d-connect): " export
read -p "Which account would you like this contact flow created in? (d-connect , s-connect or p-connect): " account
read -p "Please give a description of the contact flow: " description
# aws cli commands
cli_commands () {
# Look up the ID of the flow we're exporting from the dev instance
CLI=$(aws connect list-contact-flows --instance-id $DEV_INSTANCE_ID --profile $ACCOUNT_D --region $REGION | grep -C 2 $export | grep Id | awk '{print $2 }' | tr -d \",)
# Export the flow content from dev
aws connect describe-contact-flow --instance-id $DEV_INSTANCE_ID --contact-flow-id $CLI --profile $ACCOUNT_D --region $REGION | jq '.ContactFlow.Content | fromjson' > $export.json
# Snapshot the contact flow and queue listings for each environment (used as name -> ID lookups)
aws connect list-contact-flows --instance-id $DEV_INSTANCE_ID --profile $ACCOUNT_D --region $REGION >> dev-contact-flows.json
aws connect list-contact-flows --instance-id $STAGING_INSTANCE_ID --profile $ACCOUNT_S --region $REGION >> staging-contact-flows.json
aws connect list-contact-flows --instance-id $PROD_INSTANCE_ID --profile $ACCOUNT_P --region $REGION >> prod-contact-flows.json
aws connect list-queues --instance-id $DEV_INSTANCE_ID --profile $ACCOUNT_D --region $REGION >> dev-queues.json
aws connect list-queues --instance-id $STAGING_INSTANCE_ID --profile $ACCOUNT_S --region $REGION >> staging-queues.json
aws connect list-queues --instance-id $PROD_INSTANCE_ID --profile $ACCOUNT_P --region $REGION >> prod-queues.json
# Work out the flow type from the dev listing
CL_TYPE=$(jq -r --arg export $export '.ContactFlowSummaryList[]|select(.Name==$export).ContactFlowType' dev-contact-flows.json)
# Collect the names of any contact flows and queues referenced inside the exported flow
grep -wA 2 "ContactFlow" $export.json | grep text | awk '{print$2}' | tr -d \" | sort -u >> contact-flows.txt
grep -wA 2 "queue" $export.json | grep text | awk '{print $2" "$3}' | tr -d \" | sort -u >> queues.txt
}
queues () {
# Resolve each referenced queue name to its ID in dev, staging and prod
cat queues.txt | while read -r LINE; do
jq --arg LINE "$LINE" -r '.QueueSummaryList[]| select(.Name==$LINE).Id' dev-queues.json >> dev-queues-id.txt
done
cat queues.txt | while read -r LINE; do
jq --arg LINE "$LINE" -r '.QueueSummaryList[]| select(.Name==$LINE).Id' staging-queues.json >> staging-queues-id.txt
done
cat queues.txt | while read -r LINE; do
jq --arg LINE "$LINE" -r '.QueueSummaryList[]| select(.Name==$LINE).Id' prod-queues.json >> prod-queues-id.txt
done
}
contact_flows () {
# Resolve each referenced contact flow name to its ID in dev, staging and prod
cat contact-flows.txt | while read LINE; do
jq --arg LINE $LINE -r '.ContactFlowSummaryList[]| select(.Name==$LINE).Id' dev-contact-flows.json >> dev-contact-flow-ids.txt
done
cat contact-flows.txt | while read LINE; do
jq --arg LINE $LINE -r '.ContactFlowSummaryList[]| select(.Name==$LINE).Id' staging-contact-flows.json >> staging-contact-flow-ids.txt
done
cat contact-flows.txt | while read LINE; do
jq --arg LINE $LINE -r '.ContactFlowSummaryList[]| select(.Name==$LINE).Id' prod-contact-flows.json >> prod-contact-flow-ids.txt
done
}
main () {
if [ "$account" == "$ACCOUNT_S" ];
then
# Replace dev contact flow IDs in the exported JSON with their staging counterparts
paste dev-contact-flow-ids.txt staging-contact-flow-ids.txt >> combined-contact-flows.txt
cat combined-contact-flows.txt | while read LINE1 LINE2; do
sed -i "s/$LINE1/$LINE2/g" $export.json
done
paste dev-queues-id.txt staging-queues-id.txt >> combined-queues.txt
cat combined-queues.txt | while read LINE1 LINE2; do
sed -i "s/$LINE1/$LINE2/g" $export.json
done
echo "cloning repo"
git clone
cd contact-flow-automation/$account/
mv $DIR/$export.json $DIR/contact-flow-automation/$account/
sed -i -e "s/$ACCOUNT_ID_D/$ACCOUNT_ID_S/g" -e "s/$LEX_D/$LEX_S/g" -e "s/$DEV_INSTANCE_ID/$STAGING_INSTANCE_ID/g" $DIR/contact-flow-automation/$account/$export.json
cat << EOF >> $DIR/contact-flow-automation/$account/main.tf
resource "aws_connect_contact_flow" "$name" {
instance_id = "$STAGING_INSTANCE_ID"
name = "$export"
description = "$description"
type = "$CL_TYPE"
filename = "$export.json"
content_hash = filebase64sha256("$export.json")
}
EOF
terraform init
terraform apply --var-file=terraform.tfvars -auto-approve
git add .
git commit -m "automation update for contact flow $export to $account"
git push origin main
elif [ "$account" == "$ACCOUNT_P" ];
then
# Replace dev contact flow IDs in the exported JSON with their prod counterparts
paste dev-contact-flow-ids.txt prod-contact-flow-ids.txt >> combined-contact-flows.txt
cat combined-contact-flows.txt | while read LINE1 LINE2; do
sed -i "s/$LINE1/$LINE2/g" $export.json
done
paste dev-queues-id.txt prod-queues-id.txt >> combined-queues.txt
cat combined-queues.txt | while read LINE1 LINE2; do
sed -i "s/$LINE1/$LINE2/g" $export.json
done
echo "cloning repo"
git clone
cd contact-flow-automation/$account/
mv $DIR/$export.json $DIR/contact-flow-automation/$account/
sed -i -e "s/$ACCOUNT_ID_D/$ACCOUNT_ID_P/g" -e "s/$LEX_D/$LEX_P/g" -e "s/$DEV_INSTANCE_ID/$PROD_INSTANCE_ID/g" $DIR/contact-flow-automation/$account/$export.json
cat << EOF >> $DIR/contact-flow-automation/$account/main.tf
resource "aws_connect_contact_flow" "$export" {
instance_id = "$PROD_INSTANCE_ID"
name = "$export"
description = "$description"
type = "$CL_TYPE"
filename = "$export.json"
content_hash = filebase64sha256("$export.json")
}
EOF
terraform init
terraform apply --var-file=terraform.tfvars -auto-approve
git add .
git commit -m "automation update for contact flow $export to $account"
git push origin main
fi
}
cli_commands
queues
contact_flows
main
#clean-up
cd $DIR
rm -rf dev-contact-flows.json staging-contact-flows.json prod-contact-flows.json \
  dev-queues.json staging-queues.json prod-queues.json \
  contact-flows.txt queues.txt \
  dev-queues-id.txt staging-queues-id.txt prod-queues-id.txt \
  dev-contact-flow-ids.txt staging-contact-flow-ids.txt prod-contact-flow-ids.txt \
  combined-contact-flows.txt combined-queues.txt
echo "You need to save/publish the $export contact flow"
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Liam |
