How do I solve the AWSSecurityTokenServiceException error in Databricks?

I have been trying to read some data from an AWS S3 bucket into Databricks. The bucket path is development/team/user. I am using this Scala statement:

val test = spark.read.format("team").load("/mnt/development/team/user/data.txt")

and I get the following:

com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: User [user] is not authorized to perform sts:AssumeRole on resource: [resource]

How do I resolve this error? Is there a problem with the Databricks cluster I am running, or is there a way to allow my Databricks role to perform sts:AssumeRole in AWS? I am not sure if or how I can do either.



Solution 1:[1]

Yes, I have run into this before. You are getting this error because the IAM role attached to your Databricks cluster (the instance profile) is not permitted to call sts:AssumeRole on the target role. To fix it, attach a policy to the cluster's role that allows the sts:AssumeRole action on the target role's ARN, and also update the target role's trust relationship so that it trusts the cluster's role.
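As a sketch of the two pieces involved (the account ID and role names below are placeholders, not taken from the question), the policy attached to the Databricks cluster's role would allow the assume-role call:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::123456789012:role/target-role"
    }
  ]
}
```

and the target role's trust relationship would name the cluster's role as a trusted principal:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:role/databricks-cluster-role"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Both sides are required: the caller needs permission to assume, and the target role must trust the caller.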

This link will help you: https://forums.aws.amazon.com/thread.jspa?threadID=266814
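If the bucket is accessed through the S3A connector rather than a pre-configured DBFS mount, Hadoop's assumed-role credential provider can also be configured from Spark. A minimal sketch, assuming a placeholder role ARN and bucket name (these are not from the question), using the hadoop-aws AssumedRoleCredentialProvider:

```scala
// Sketch: have the S3A connector assume the target role itself.
// The role ARN and bucket below are placeholders, not real values.
spark.conf.set(
  "spark.hadoop.fs.s3a.aws.credentials.provider",
  "org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider")
spark.conf.set(
  "spark.hadoop.fs.s3a.assumed.role.arn",
  "arn:aws:iam::123456789012:role/target-role")

// Read the file once the role can be assumed successfully.
val test = spark.read.text("s3a://development/team/user/data.txt")
```

This only works once the IAM permissions and trust relationship described above are in place; the credential provider makes the sts:AssumeRole call on your behalf, so it fails with the same exception otherwise.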

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Aviral Bhardwaj