Amazon S3 downloads index.html instead of serving it
I've set up Amazon S3 to serve my static site, speakeasylinguistics.com. All of the DNS seems to be working, because dig +recurse +trace www.speakeasylinguistics.com outputs the correct DNS info.
But when I visit the site in a browser using the endpoint, index.html downloads instead of being served. How do I fix this?
I've tried Chrome, Safari, and Firefox; it happens in all of them. I followed Amazon's walkthrough on hosting a static website on a custom domain to a T.
Solution 1:[1]
If you are using HashiCorp Terraform, you can specify the content type on an aws_s3_bucket_object as follows:
resource "aws_s3_bucket_object" "index" {
bucket = "yourbucketnamehere"
key = "index.html"
content = "<h1>Hello, world</h1>"
content_type = "text/html"
}
This should serve your content appropriately in the browser.
Edit 24/05/22: As mentioned in the comments on this answer, Terraform now has a module to help with uploading files and setting their content-type attribute correctly.
Solution 2:[2]
If you are doing this programmatically, you can set the ContentType and/or ContentDisposition params in your upload.
PHP example:
$output = $s3->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => md5($share) . '.html',
    'ContentType' => 'text/html',
    'Body'        => $share,
));
Solution 3:[3]
If you are uploading with Boto3 on Python 3.7 or above, pass the content type through ExtraArgs:

import boto3

s3 = boto3.client('s3')
s3.upload_file(local_file, bucket, s3_file, ExtraArgs={'ContentType': 'text/html'})
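If you upload many file types, you can guess the content type from the file name instead of hard-coding it. A minimal sketch, assuming boto3 credentials are already configured; upload_with_type is a hypothetical helper, not part of the original answer:

import mimetypes

import boto3

s3 = boto3.client('s3')

def upload_with_type(local_file, bucket, key):
    # Guess the Content-Type from the file name; fall back to a generic
    # binary type for unknown extensions.
    content_type = mimetypes.guess_type(local_file)[0] or 'application/octet-stream'
    s3.upload_file(local_file, bucket, key, ExtraArgs={'ContentType': content_type})

upload_with_type('index.html', 'yourbucketnamehere', 'index.html')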
Solution 4:[4]
For anyone else facing this issue, there's a typo in the URL you can find under Properties > Static website hosting. For instance, the URL provided is
http://{bucket}.s3-website-{region}.amazonaws.com
but it should be
http://{bucket}.s3-website.{region}.amazonaws.com
Note the . between website and region. (Whether the separator is a dash or a dot actually depends on the Region; check the exact endpoint shown in the S3 console.)
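If you are unsure which form your Region uses, a quick check (a sketch, assuming public static website hosting and the placeholder bucket and Region below) is to request both variants and see which one answers:

import urllib.request

bucket, region = 'yourbucketnamehere', 'eu-central-1'  # placeholders
for sep in ('-', '.'):
    url = f'http://{bucket}.s3-website{sep}{region}.amazonaws.com'
    try:
        # The valid endpoint should return 200 with a text/html body.
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(url, '->', resp.status, resp.headers.get('Content-Type'))
    except Exception as exc:
        print(url, '->', exc)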
Solution 5:[5]
I recently had the same issue pop up; the problem was a change of behavior between CloudFront and the S3 origin. If your S3 bucket is configured to serve a static website, you need to enter the website endpoint as a custom origin instead of picking the S3 origin from the dropdown. If you are using Terraform, your origin should be aws_s3_bucket.var.website_endpoint instead of aws_s3_bucket.var.bucket_domain_name.
Refer to the AWS documentation.
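To check which endpoint an existing distribution points at, here is a minimal sketch, assuming boto3 credentials and a placeholder distribution ID:

import boto3

cloudfront = boto3.client('cloudfront')
config = cloudfront.get_distribution_config(Id='EDFDVBD6EXAMPLE')['DistributionConfig']
for origin in config['Origins']['Items']:
    domain = origin['DomainName']
    # Website endpoints contain "s3-website"; REST endpoints do not.
    kind = 'website endpoint' if 's3-website' in domain else 'REST endpoint - change this'
    print(domain, '->', kind)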
Solution 6:[6]
I recently came across this issue, and the root cause seemed to be that object versioning was enabled. After disabling versioning on the bucket, the index HTML was served as expected.
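If you want to try this programmatically, here is a sketch, assuming boto3 and the placeholder bucket name; note that once enabled, versioning can only be suspended, not removed:

import boto3

s3 = boto3.client('s3')

# Suspend versioning: existing object versions are kept, but new
# uploads no longer create additional versions.
s3.put_bucket_versioning(
    Bucket='yourbucketnamehere',
    VersioningConfiguration={'Status': 'Suspended'},
)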
Solution 7:[7]
I had the same problem when uploading to an S3 static site from NodeJS. As others have mentioned, the issue was caused by a missing content type when uploading the file. When using the web interface, the content type is applied automatically for you; however, when uploading manually you need to specify it (see: List of S3 Content Types).
In NodeJS, you can attach the content type like so:
const { extname } = require('path');
const { createReadStream } = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// Map file extensions to MIME types; add more types as needed.
const getMimeType = ext => {
  switch (ext) {
    case '.js':
      return 'application/javascript';
    case '.html':
      return 'text/html';
    case '.txt':
      return 'text/plain';
    case '.json':
      return 'application/json';
    case '.ico':
      return 'image/x-icon';
    case '.svg':
      return 'image/svg+xml';
    case '.css':
      return 'text/css';
    case '.jpg':
    case '.jpeg':
      return 'image/jpeg';
    case '.png':
      return 'image/png';
    case '.webp':
      return 'image/webp';
    case '.map':
      return 'binary/octet-stream';
    default:
      return 'application/octet-stream';
  }
};

(async () => {
  const file = 'index.html';
  const params = {
    Bucket: 'myBucket',
    Key: file,
    Body: createReadStream(file),
    // extname() returns the extension with a leading dot, e.g. '.html'
    ContentType: getMimeType(extname(file)),
  };
  await s3.putObject(params).promise();
})();
Solution 8:[8]
If you are using AWS S3 with Bitbucket Pipelines and Python, then add the parameter content_type as follows:
s3_upload.py

def upload_to_s3(bucket, artefact, bucket_key, content_type):
    ...

def main():
    ...
    parser.add_argument("content_type", help="Content Type File")
    ...
    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key, args.content_type):
and modify bitbucket-pipelines.yml as follows:
...
- python s3_upload.py bucket_name file key content_type
...
Where the content_type param can be one of the MIME types (IANA media types).
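For reference, a minimal sketch of what s3_upload.py might look like with content_type wired through; the function and argument names follow the snippet above, but the boto3 call is an assumption, not the original script:

import argparse

import boto3

def upload_to_s3(bucket, artefact, bucket_key, content_type):
    # Hypothetical implementation: upload the artefact with an explicit
    # Content-Type so S3 serves it instead of downloading it.
    s3 = boto3.client('s3')
    try:
        s3.upload_file(artefact, bucket, bucket_key,
                       ExtraArgs={'ContentType': content_type})
    except Exception as exc:
        print(f'Upload failed: {exc}')
        return False
    return True

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('bucket', help='Bucket name')
    parser.add_argument('artefact', help='Local file to upload')
    parser.add_argument('bucket_key', help='Destination key in the bucket')
    parser.add_argument('content_type', help='Content Type File')
    args = parser.parse_args()
    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key, args.content_type):
        raise SystemExit(1)

if __name__ == '__main__':
    main()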
Solution 9:[9]
I've been through the same issue and resolved it this way: in the S3 bucket, select the index.html checkbox, open the Actions tab, and choose Edit Metadata. You will notice that under Metadata it says "Type: System defined, Key: Content-Type, Value: binary/octet-stream". Change the value to "text/html" and save the changes. Then select index.html and click the "Open" button. That worked for me.
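The programmatic equivalent of that console edit (a sketch, assuming boto3 and the placeholder bucket name) is to copy the object onto itself while replacing its metadata:

import boto3

s3 = boto3.client('s3')
bucket = 'yourbucketnamehere'  # placeholder

# Copying an object onto itself with MetadataDirective='REPLACE'
# rewrites its metadata, including the Content-Type.
s3.copy_object(
    Bucket=bucket,
    Key='index.html',
    CopySource={'Bucket': bucket, 'Key': 'index.html'},
    ContentType='text/html',
    MetadataDirective='REPLACE',
)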
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | Brombomb |
| Solution 3 | dannisis |
| Solution 4 | CPak |
| Solution 5 | codaddict |
| Solution 6 | rpf3 |
| Solution 7 | John Galt |
| Solution 8 | e-israel |
| Solution 9 | Fernando Taboada |
