Fri, 29 Dec 2017
# Nginx to cache dynamic PHP (Laravel) pages. Make your website partly static and reduce response time.
# Intro
There is a strong trend in web development toward responsive and performant user experiences. The faster a page loads, the more pleasant the experience is for visitors. Today, we have to do our best to stay competitive in the eyes of our customers. The quality of a product is how it is perceived by users, and a slow-loading website is far from a quality one.
There is a vast number of ways to improve the performance of web pages: HTTP/2, CDNs, load balancing, PHP opcode caching, general caching, and so on and so forth. In this article, I will show how to cache static HTML responses generated by a Laravel app and serve them with nginx without executing PHP at all.
This trick works only for pages which are the same for every user, like login, signup, password reset, and similar pages. Even these pages may contain dynamic content, like a CSRF token, which you can load via an AJAX request after the page has loaded.
# Prerequisites
This tutorial is just a proof of concept; learn from it, but don't use it in production as is.
You need to be familiar with PHP (7.1+), the Laravel framework (5.5), and the nginx web server in order to find this tutorial useful.
We will write a custom middleware which dumps HTML responses to disk, and a command to remove stale pages.
Please install a fresh Laravel app and make sure you have nginx up and running on your server.
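If you are starting from scratch, a fresh Laravel 5.5 project can be created with Composer (the project folder name below is just an example):
composer create-project --prefer-dist laravel/laravel cache-html-demo "5.5.*"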
# Writing a middleware
Let's write a so-called terminable middleware. This is a middleware whose logic runs after the response has been sent to the browser. Execute this command:
php artisan make:middleware CacheHtmlResponse
# Writing a test first
Let's create a new test at tests/Http/Middleware/CacheHtmlResponseTest.php
like this:
<?php

namespace Tests\Http\Middleware;

use Tests\TestCase;

class CacheHtmlResponseTest extends TestCase
{
    public function test_middleware_creates_html_file()
    {
        $uri = '/a/b/c';
        $response = "hello";

        \Route::get($uri, function () use ($response) {
            return $response;
        })->middleware('cache-html');

        $this->get($uri)->assertSuccessful();

        $this->assertTrue(\Storage::disk('cache-html')->exists($uri . ".html"));
        $this->assertEquals($response, \Storage::disk('cache-html')->get($uri . ".html"));
    }

    public function test_middleware_creates_index_html_file()
    {
        $uri = '/';
        $response = "hello";

        \Route::get($uri, function () use ($response) {
            return $response;
        })->middleware('cache-html');

        $this->get($uri)->assertSuccessful();

        $this->assertTrue(\Storage::disk('cache-html')->exists("index.html"));
        $this->assertEquals($response, \Storage::disk('cache-html')->get("index.html"));
    }
}
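At this point the tests will not pass yet: the generated middleware does nothing, and the cache-html disk and the route middleware alias are only configured in the following sections. Once everything is in place, you can run them with, for example:
./vendor/bin/phpunit --filter CacheHtmlResponseTest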
The class contains two tests which make sure that the middleware creates a static file after the request has been served. As you can see, it uses the Storage facade and checks files on the cache-html disk. Let's update the config/filesystems.php file like this:
'disks' => [

    // ...

    'cache-html' => [
        'driver' => 'local',
        'root' => env('STORAGE_CACHE_HTML_PATH', storage_path('app/public/cache-html')),
    ],

],

// ...
As you can see, we can set the root folder via an ENV variable, falling back to a default location. This is especially useful for running tests. Edit phpunit.xml like this:
<!-- ... -->
<php>
    <!-- ... -->
    <env name="STORAGE_CACHE_HTML_PATH" value="./tests/tmp"/>
</php>
Here we set the location for cached HTML files during tests. Also, we want to remove all temporary files after each test is done. Edit the tests/TestCase.php file like this:
<?php

namespace Tests;

use Illuminate\Foundation\Testing\TestCase as BaseTestCase;

abstract class TestCase extends BaseTestCase
{
    use CreatesApplication;

    protected function tearDown()
    {
        parent::tearDown();

        // remove temp folder after each test
        `rm -rf ./tests/tmp`;
    }
}
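The backtick expression shells out to rm and assumes the tests are run from the project root. If you prefer a framework-only approach, here is a sketch of an alternative tearDown() (my own variation, not part of the original tutorial):

protected function tearDown()
{
    // remove the temp folder before the application is torn down,
    // so the File facade can still resolve its underlying service
    \Illuminate\Support\Facades\File::deleteDirectory(__DIR__ . '/tmp');

    parent::tearDown();
}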
To make the tests pass, edit app/Http/Middleware/CacheHtmlResponse.php like this:
<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Illuminate\Http\Response;

class CacheHtmlResponse
{
    /**
     * Handle an incoming request.
     *
     * @param  \Illuminate\Http\Request  $request
     * @param  \Closure  $next
     * @return mixed
     */
    public function handle($request, Closure $next)
    {
        return $next($request);
    }

    public function terminate(Request $request, Response $response)
    {
        // 1. Detect a relative path to put the request at
        $path_parts = explode('/', trim($request->getPathInfo(), '/'));
        $file_part = array_pop($path_parts);
        $file = (strlen($file_part) ? $file_part : "index") . '.html';
        $relative_path = implode("/", $path_parts);

        // 2. Create a folder
        \Storage::disk('cache-html')->makeDirectory($relative_path);

        // 3. Put a file with response HTML
        \Storage::disk('cache-html')->put($relative_path . "/" . $file, $response->getContent());
    }
}
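Note that terminate() runs for every response that passes through the middleware, including error pages and non-GET requests. A guard you may want to add at the top of terminate() is sketched below; this check is my own addition and is not covered by the tests above:

// assumption: only cache successful GET responses
if (! $request->isMethod('GET') || $response->getStatusCode() !== 200) {
    return;
}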
# Enable middleware
Now that we have implemented and tested the middleware, we can register it in the app/Http/Kernel.php file like this:
// ...

protected $routeMiddleware = [
    // ...
    'cache-html' => \App\Http\Middleware\CacheHtmlResponse::class,
];
And now we can choose which routes are static enough to be cached. For now, I will create two sample routes, /test/cached and /test/not-cached, and enable the middleware for the first one. Edit your routes/web.php file like this:
<?php

Route::get('/test/cached', function () {
    return \Carbon\Carbon::now()->toIso8601String();
})->middleware('cache-html');

Route::get('/test/not-cached', function () {
    return \Carbon\Carbon::now()->toIso8601String();
});
# Configuring nginx
OK, now that we have finished writing the application logic, it is time to configure the server side. As we have chosen to put our cached files into storage/app/public/cache-html, we tell nginx to look for files in this folder before calling the PHP engine. Note that the try_files paths below are resolved relative to the root directive of your server block, which should point at the project's public directory.
Edit your nginx config file like this:
# ...

location = / {
    try_files /cache-html/index.html /index.php?$args;
}

location / {
    try_files /cache-html/$uri.html $uri $uri/ /index.php?$args;
}
Then make sure the config is valid; in my case, on Ubuntu, I execute this command:
sudo service nginx configtest
And then:
sudo service nginx reload
And the last step here is to link our public/cache-html folder to storage/app/public/cache-html (the built-in php artisan storage:link command links public/storage instead, which is not what we need here). Execute this command:
ln -s /path/to/project/storage/app/public/cache-html /path/to/project/public/cache-html
# Quick result comparison
Let's compare the response times of the two routes. To exclude any network-related latency, I will run these commands directly on the server to get the response timings:
curl -s -w %{time_total}\\n -o /dev/null http://127.0.0.1/test/cached
0.001
curl -s -w %{time_total}\\n -o /dev/null http://127.0.0.1/test/not-cached
0.032
Since the command shows the response time in seconds, we get 1 ms for the cached page and 32 ms for the dynamically generated PHP page (which does not even perform any external DB queries).
So roughly we have saved 30 ms of loading time. Is that much? I guess every millisecond counts when we talk about user experience. Combine this technique with servers in multiple locations across the globe to get your static HTML response time down to something like 20-50 ms.
# Add a scheduled command to purge the cache
Now that we cache our responses, we may want to purge old cache files from time to time. Let's use a command that deletes cache files older than 5 days:
find /path/to/project/public/cache-html/* -type f -mtime +5 -exec rm {} \;
Put this command into your crontab:
crontab -e
# and append this line at the end:
0 0 * * * find /path/to/project/public/cache-html/* -type f -mtime +5 -exec rm {} \;
# then save it
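Alternatively, if you prefer to keep this inside the application, Laravel's scheduler can run the same command. A minimal sketch for the schedule() method of app/Console/Kernel.php (the path is still a placeholder you need to adjust):

protected function schedule(Schedule $schedule)
{
    // purge cached HTML files older than 5 days, once a day at midnight
    $schedule->exec('find /path/to/project/public/cache-html/* -type f -mtime +5 -exec rm {} \;')
             ->daily();
}

Remember that the scheduler itself still relies on the single standard cron entry that runs php artisan schedule:run every minute.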
# Final notes
The idea behind serving static files is anything but new. There are dozens of static site generators which allow you to build web pages and serve them statically. The middleware we have just created is just an idea for you to think about when it comes to performance.
If your forms are cached, then you need to refresh their CSRF tokens via JavaScript AJAX calls, for example against a small uncached endpoint as sketched below.
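A minimal sketch of such an endpoint (the route name is my own, pick whatever fits your app):

// routes/web.php -- note: no 'cache-html' middleware on this route
Route::get('/csrf-token', function () {
    return response()->json(['token' => csrf_token()]);
});

On the client side, fetch this endpoint after the cached page has loaded and replace the value of the hidden _token input (and the csrf-token meta tag, if you use one) before the form is submitted.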
This website uses a similar feature.