Make your Laravel site 6x faster using Redis as a caching server

How I made a Laravel site load 6x faster in 15 minutes.

I like it when a story/tutorial is backed up with genuine examples from real projects. So for this one I will use the Codingo Tuts site as a foundation, and I will show you some pieces of code that were, or still are, active on the site.

The examples here are pretty general, so you can follow along and implement the solution on your own site.

You can find the example code that accompanies this tutorial on GitHub: https://github.com/codingo-me/laravel-redis-cache. There you will find all the controllers, repositories and views, with a simplified layout.

Testing loading speed, memory usage and number of queries

The best option for this is Laravel Debugbar; it has all of the features I need and many more.

First, let's pull in this package via Composer.


composer require barryvdh/laravel-debugbar

Now I will add the service provider and facade to config/app.php:


//.....
    Barryvdh\Debugbar\ServiceProvider::class,
//.......
//....Facades...
    'Debugbar' => Barryvdh\Debugbar\Facade::class,

The package is now installed. If I now edit the .env file and set APP_DEBUG=true, I should see Laravel Debugbar.

If you get an error about a missing class, just run composer dump-autoload.

If you are still seeing the error, add these two routes to routes.php:

Route::get('/_debugbar/assets/stylesheets', [
    'as' => 'debugbar-css',
    'uses' => '\Barryvdh\Debugbar\Controllers\AssetController@css'
]);

Route::get('/_debugbar/assets/javascript', [
    'as' => 'debugbar-js',
    'uses' => '\Barryvdh\Debugbar\Controllers\AssetController@js'
]);

DebugBar results of non-optimized site

Now I will test the home page and see where the site stands.

Without any caching or optimizations

DebugBar results without any optimizations

From the debug bar we can see that the request used 10.5MB of memory and took 121.11ms to load the entire page. Laravel rendered 5 different view files: the main layout, the home page, 2 sidebar views and the footer view. It also ran 6 database queries to load menus, trending posts, home page posts, etc.

I know I could use eager loading to reduce the number of database queries, or cache those queries and restore the results from the cache. But I wanted to cut down the number of queries and rendered views, and load the page with less memory in less time. The site is very simple right now, so this number of queries seemed like too much to me.

Laravel caching drivers

Laravel has built-in support for 5 cache drivers: apc, file, database, memcached and redis. The easiest one to set up is the default file driver, which stores pages, arrays and anything else you cache inside the storage directory. The database driver doesn't look like a reasonable solution to me when the goal is to minimize database queries.

In the end, only 2 options were left: memcached and Redis. I have some basic memcached experience, while Redis was completely new to me. Luckily, the Redis documentation is great and easy to follow.

While I was weighing which one is better for my scenario, I had a few requirements in mind. I want to be able to choose which types of pages to cache. This is essentially a blog, and it has a home page, post pages, category pages, tag pages, author pages, etc. If I change the content of only 2 posts and I don't change their categories, then I only want to regenerate the post pages.

Secondly, I want to configure everything with as few modifications and commands as possible.

On the Laravel site I noticed the Cache Tags section, and those tags will allow me to tag (cache-tag) pages as: post, category, tag (post tag), author or home.
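To make that invalidation idea concrete, here is a minimal sketch of how tagged caching works in Laravel (the tag names mirror the ones above; the variables holding rendered HTML are my own illustration):

```php
// Store two rendered pages under different tags.
Cache::tags('post')->put('my-first-post', $postHtml, 43200);
Cache::tags('home')->put('home', $homeHtml, 43200);

// After editing posts, wipe only the post pages;
// the cached home page stays untouched.
Cache::tags('post')->flush();
```

Note that cache tags are exactly why the file and database drivers fall short here: Laravel only supports tags on drivers like Redis and memcached.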

Installing Redis

I am using an Ubuntu server, and installing Redis couldn't be easier.


sudo apt-get update
sudo apt-get install redis-server

Now I will check that it is running:


redis-benchmark -q -n 1000 -c 10 -P 5

This is the redis-benchmark script; this command will run 1000 requests over 10 parallel connections, pipelining 5 requests at a time. The results will look like this:


    PING_INLINE: 200000.00 requests per second
    PING_BULK: 249999.98 requests per second
    SET: 333333.34 requests per second
    GET: 333333.34 requests per second
    INCR: 499999.97 requests per second
    LPUSH: 249999.98 requests per second
    LPOP: 249999.98 requests per second
    SADD: 249999.98 requests per second
    SPOP: 333333.34 requests per second
    LPUSH (needed to benchmark LRANGE): 249999.98 requests per second
    LRANGE_100 (first 100 elements): 66666.67 requests per second
    LRANGE_300 (first 300 elements): 13888.89 requests per second
    LRANGE_500 (first 450 elements): 9009.01 requests per second
    LRANGE_600 (first 600 elements): 6535.95 requests per second
    MSET (10 keys): 47619.05 requests per second

Before using Redis with Laravel, you'll need to install the predis/predis package via Composer.

composer require predis/predis

This is enough to connect Redis and Laravel. The only thing left is changing the CACHE_DRIVER key in .env to redis. The default port should be fine; if you modified that port, just update the redis connection settings in config/database.php.
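For reference, the relevant .env entries look roughly like this (the host and port shown are the Redis defaults):

```
CACHE_DRIVER=redis
REDIS_HOST=127.0.0.1
REDIS_PORT=6379
```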

How to cache pages - the quickest way

This site has a small number of pages right now, so I can cache entire pages without straining the Redis server; it will stay very responsive. I have also worked on a couple of sites with over 50,000 pages stored in the database, and this option would not work for them.

The only way content changes on this site is when a moderator adds new content. Users cannot generate any content, comments are loaded by Disqus, and everything else is static.

So I will use one dirty trick for caching content: an artisan command which accepts a page type as an argument and fills the cache. That command can be run from the terminal, or triggered after some action in the admin panel.
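A minimal version of such a command might look like this. This is a sketch based on the description above, not the exact code from the site; the class name and the repository it delegates to are my own illustration:

```php
class CacheSite extends Command
{
    // Accepts a page type (home, posts, all, ...), defaulting to 'all'
    protected $signature = 'tuts:cache {type=all}';

    protected $description = 'Fill the Redis cache with rendered pages';

    public function handle(PageCacheRepository $cache)
    {
        // The repository decides which pages to regenerate
        // based on the given type.
        $cache->store($this->argument('type'));
    }
}
```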

There is always the possibility that something is not in the cache, in which case I want to load that content from the database.

Retrieving item from the cache

Retrieving a cache item uses the same notation as getting a session variable.


$item = Cache::get('key');

In my case, this key will be the name of the page. If the key is missing from the cache, the get method returns null.

I can also check for the existence of that key in the cache with:


Cache::has('key')

But why bother the system with 2 requests (one to check whether the key exists and another to retrieve it) when I can just retrieve it and check whether it is null?

So the system first checks whether the page exists in the cache; if it does, it serves it from the cache, otherwise it gets it from the database.


    public function getIndex()
    {
        // Check cache first
        $page = Cache::get('home');
        if ($page !== null) {
            return $page;
        }

        //Get from the database
        return $this->render->getHome();
    }

Because the entire page is stored in the cache, returning this item from the cache directly to the user works without any issues. The last line returns a view populated with values from the database and rendered to a string, something like this:


    public function getHome()
    {
        $posts = Post::with('author')->orderBy('created_at', 'desc')->get();

        return view('pages.home', compact('posts'))->render();
    }

Now if I access the home page, it will still load data from the database, because nothing is stored in the cache yet.

Storing pages in the cache

As I mentioned, the main reason I decided to use Redis over memcached is cache tags. They allow me to regenerate the cache for posts only, categories only, or the home page only. Tags are also handy because, with 400 items in the cache, I can address the item I need by first passing the tag name and then the item name.


    Cache::tags('tag_name')->put($key, $content, $duration_in_minutes);

This snippet is pretty clear. One thing to keep in mind is that when the expiration time ends, the item is removed from the cache. So in order to keep serving from the cache, you should either set a longer duration here or have a command which regularly refills the cache with fresh values.

In my case, I scheduled this command to run on a daily basis in the Kernel class like this:

    protected function schedule(Schedule $schedule)
    {
        $schedule->command('tuts:cache all')
                  ->daily();
    }

As I mentioned earlier, I am filling the cache with an artisan command. For that command I created a dedicated repository which holds the methods for storing pages in the cache. It also has a main entry point with a switch statement, which receives the argument from the artisan command and caches only the matching page types.
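That entry point could look something like this sketch (the method name and the exact set of cases are assumptions based on the description above):

```php
public function store($type)
{
    switch ($type) {
        case 'home':
            $this->storeHome();
            break;
        case 'posts':
            $this->storePosts();
            break;
        case 'all':
            $this->storeHome();
            $this->storePosts();
            // ...categories, tags, authors
            break;
        default:
            throw new InvalidArgumentException("Unknown page type: {$type}");
    }
}
```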

Storing home page looks like this:


    private function storeHome()
    {
        $this->putInCache('home', $this->render->getHome(), 'home');
    }

    private function putInCache($key, $content, $tag)
    {
        Cache::tags($tag)->put($key, $content, 43200);
    }

Generating post pages is very similar:


    private function storePosts()
    {
        $posts = Post::all();

        foreach ($posts as $post) {
            $this->putInCache( $post->slug, $this->render->getPost($post), 'post' );
        }
    }

Since this is an artisan command, I can generate these pages from the console:


    php artisan tuts:cache posts

There are a thousand different ways I could optimize this further, but this is enough to get you started.

Results after configuring Redis cache

Page loading time is greatly improved with these modifications; check it out.

With Redis caching

DebugBar results with Redis caching

This optimized version of the home page uses half the memory, runs zero database queries, renders zero views, and loading time dropped from 121ms to 19.5ms. Looking at these results, the loading speed is roughly 6 times better now. On a real server it is even better.

All of this is accomplished without touching the Redis or Apache configurations. There are many options which could improve things even beyond this 6x faster loading; I will talk about them soon.

Extra: Redsmin, Redis GUI with 1-step installation

I am using Redis for queues and caching almost all the time, and I find the built-in redis-cli monitor a bit limited. I wanted a graphical interface, like Sequel Pro for databases. After many tries, I was finally satisfied with Redsmin.

It's an online GUI tool, and to get it working I only needed to copy/paste one command into the terminal. I am using Ubuntu, so for me that one command is:

npm install redsmin@latest -g && REDSMIN_KEY=57ebea73cfhfhhf02455f86 redsmin
Redsmin Setup Instructions

OK, actually these are 2 commands but you get the idea ;)

The first part installs the redsmin package and the second part invokes it with a dedicated key which is different for each instance.

When the second part executes, you need to click the Connect this instance to Redsmin button. Communication happens in real time via sockets, so both ends need to be active: the browser with the approved connection and the server package. The terminal where you run this command will be occupied by Redsmin; if you stop it with CTRL+C, the connection to the Redsmin site will be broken.

Once the connection is established, you'll be able to see almost all parameters of the Redis instance, including all the keys and databases inside.

Redsmin info Tab

I usually use it when debugging the queue-worker chain, to check in real time how the workers are progressing.

Codingo Tuts Redis Cache

Redsmin on Ubuntu requires Node version 6+, so if you have an older version you'll need to update it with:

sudo npm cache clean -f
sudo npm install -g n
sudo n stable

These commands install the latest stable Node.js version using the n helper package.
