Posts

Showing posts with the label Nginx

403 Forbidden On Nginx/1.4.6 (Ubuntu) - Laravel

Answer : You need to specify an absolute path for your root directive. Nginx resolves relative paths against the prefix directory set at compile time with the --prefix switch, which defaults to /usr/local/nginx. What this means is that your root, currently set to home/laravel-app/, causes nginx to look for files under /usr/local/nginx/home/laravel-app/, which presumably isn't where your files are. If you set your root directive to an absolute path such as /var/www/laravel-app/public/, nginx will find the files. You'll also note that I added /public/ to the path above. This is because Laravel stores its index.php file there; if you were to point at /laravel-app/ alone, there would be no index file and you'd get a 403.
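For concreteness, a minimal sketch of such a server block, assuming the app lives at /var/www/laravel-app (the path from the answer) and PHP-FPM listens on a Unix socket; the server_name and socket path are placeholders, not from the original question:

server {
    listen 80;
    server_name example.com;                  # placeholder domain
    root /var/www/laravel-app/public;         # absolute path ending in Laravel's public/ directory
    index index.php;

    location / {
        # Send pretty URLs to Laravel's front controller
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock;   # assumed PHP-FPM socket; adjust to your setup
    }
}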

413 Request Entity Too Large

Answer : Add 'client_max_body_size xxM' inside the http section of /etc/nginx/nginx.conf, where xx is the size (in megabytes) you want to allow. http { client_max_body_size 20M; } I had the same issue, but in Docker. When I faced it, I added client_max_body_size 120M; to my Nginx server configuration; the default configuration file path is /etc/nginx/conf.d/default.conf. server { client_max_body_size 120M; ... This raises the maximum request body size to 120 megabytes. Pay attention to where you put client_max_body_size, because its placement determines its scope: if you put it in a location block, only that location is affected. After that, I added these three lines to my PHP Dockerfile: RUN echo "max_file_uploads=100" >> /usr/local/etc/php/conf.d/docker-php-ext-max_file_uploads.ini RUN echo "post_max_size=120M" >> /usr/local/etc/php/conf.d/docker-php-ext-post_max_size.ini RUN echo ...
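To make the scoping point concrete, here is a minimal sketch of the three places the directive can live, assuming the stock /etc/nginx/nginx.conf layout and a hypothetical /upload location (the sizes are just examples):

http {
    client_max_body_size 20M;            # default for every server and location below

    server {
        client_max_body_size 120M;       # overrides the http-level value for this virtual host

        location /upload {
            client_max_body_size 200M;   # applies only to requests that match /upload
        }
    }
}

The most specific value wins for a given request, so a directive placed in a location block has no effect on the rest of the site.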

Cache A Static File In Memory Forever On Nginx?

Answer : Nginx as an HTTP server cannot do memory-caching of static files or pages. Nginx is a capable and mature HTTP and proxy server, but there seems to be some confusion about its capabilities with respect to caching. The Nginx server cannot memory-cache files when running as a pure web server. And…wait, what!? Let me rephrase: the Nginx HTTP server cannot memory-cache files or pages.

Possible Workaround
The Nginx community's answer is: no problem, let the OS do memory caching for you! The OS is written by smart people (true) and knows the what, when, where, and how of caching (a mere opinion). So, they say, cat your static files to /dev/null periodically and just trust it to cache your stuff for you! For those who are wondering and pondering what the cat-to-/dev/null reference has to do with caching: read on to find out more (hint: don't do it!).

How does it work?
It turns out that Linux is a fine-tuned beast that's hawk-eyed about what goes in and out of its cache thingy. Th...
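For reference, the trick being alluded to is nothing more than periodically reading the static files and throwing the output away, which pulls their contents into the Linux page cache. A minimal sketch, using a hypothetical /var/www/static directory (and, as the answer goes on to argue, not something you should rely on):

# Read every static file and discard the output, so the kernel's page cache
# ends up holding the file contents. Hypothetical path; the post advises against this.
find /var/www/static -type f -exec cat {} + > /dev/null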