Nginx Upstream prematurely closed FastCGI stdout while reading response header from upstream
I am aware that many similar questions have been posted on Stack Overflow. I have been through them, but they do not resolve my issue, so please read on before marking this as a duplicate.
I have deployed my Laravel web application behind Nginx. The application works fine both locally and on the server. However, one particular URL that returns a large amount of data causes the request to fail with a 404 error.
The Nginx error log shows the following message:
Nginx Upstream prematurely closed FastCGI stdout while reading response header from upstream
Attempted solutions: I have tried adjusting settings in both the PHP ini file and the Nginx conf files, to no avail. I also restarted the services with
systemctl restart nginx
systemctl restart php-fpm
PHP.ini
upload_max_filesize = 256M
post_max_size = 1000M
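Two other php.ini settings that often matter when a script returns a large response are memory_limit and max_execution_time: if PHP-FPM hits either limit, the worker dies mid-response, which is exactly what produces this Nginx error. A sketch of values to try (the numbers are illustrative, not known-good for this application):

```ini
; illustrative values only - tune for your workload
memory_limit = 512M        ; default is 128M; building a large response can exceed it
max_execution_time = 300   ; seconds before PHP terminates the script
```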
nginx conf
client_max_body_size 300M;
client_body_timeout 2024;
client_header_timeout 2024;
fastcgi_buffers 16 512k;
fastcgi_buffer_size 512k;
fastcgi_read_timeout 500;
fastcgi_send_timeout 500;
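For reference, a minimal sketch of how these directives might sit together inside the PHP location block (the socket path is an assumption; match it to your php-fpm pool's `listen` setting). Note that the `fastcgi_buffers` and `fastcgi_buffer_size` lines as posted are missing their trailing semicolons, which would make `nginx -t` fail and prevent the config from loading at all:

```nginx
location ~ \.php$ {
    # assumed socket path - match your php-fpm pool's "listen" directive
    fastcgi_pass unix:/run/php-fpm/www.sock;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

    fastcgi_buffers 16 512k;        # note the trailing semicolons
    fastcgi_buffer_size 512k;
    fastcgi_busy_buffers_size 1m;   # must be >= fastcgi_buffer_size
    fastcgi_read_timeout 500;
    fastcgi_send_timeout 500;
}
```

After editing, `nginx -t` confirms the syntax, and the PHP-FPM error log (location varies by distro and pool config) usually shows why the worker closed its stdout: this message typically means the PHP process died, not that Nginx itself misbehaved.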
Can someone kindly tell me what I am missing?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow