Magento Speed Optimization: How to make it insanely fast?

This article tries to be a complete practical resource on Magento Speed Optimization. It contains recommendations ranging from the very easy to implement to the most advanced.

There is much talk about how Magento consumes abundant resources and how difficult it is to work with.
We’re trying to debunk some myths and to show a different side of things.

So let’s break it down piece by piece!

Caching

The cache acts as a buffer between the user and the web server. Serving a cached web page to a visitor is a lot simpler than unleashing the full power of Magento for every browser refresh.

Types of cache

Magento has lots of them in place as you can see below.

Magento 1.x:

Magento 2.x:

All are meaningful, but the most influential one is the full page cache.

Why is cache so important?

  1. It offers stability when lots of users are browsing the website.

Here is a LoadImpact test for a Magento website with all the caches deactivated:

At about 30 concurrent users the web server crashed.

Here’s the same website with all the caches active:

That’s more like it! The load time has been stable at just over 800ms.

  2. We get a way faster server response time, also known as TTFB (Time To First Byte)

Below we are comparing the response times for a Magento 2 website using different types of caches (tested with WebPageTest.org):

It looks like Varnish and Redis Full Page Cache engines are our best options. Both are using in-memory storage engines, which makes the cache serving faster than file-system options.

Even though Redis is supported as a cache backend since Magento 1.8, Varnish is available by default only in Magento 2.
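If you go the Varnish route on Magento 2, the switch can be made from the command line. Below is a minimal sketch, assuming Magento 2.2+ (where bin/magento config:set exists) and Varnish running on the same machine; the flags and paths are examples, not a drop-in configuration:

# Tell Magento to use Varnish as the full page cache application (1 = built-in, 2 = Varnish)
bin/magento config:set system/full_page_cache/caching_application 2

# Export a VCL for Varnish (pick the export version matching your Varnish release)
bin/magento varnish:vcl:generate --export-version=4 > /etc/varnish/default.vcl

bin/magento cache:flush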

If you’re running on Magento 1.x and you have your heart set on Varnish, there are modules like Turpentine (for newer servers) or Phoenix (for old servers) to help with the integration.

  3. It takes pressure off the server, which has a direct impact on your budget, an important subject for medium to enterprise-level businesses.
  4. Cache is also crucial for SEO. In the log analysis article we talk about how it’s essential to make life easier for Googlebot if we want to get some good SEO traction.

Below is a server log chart with the response times experienced by the Google crawler.

The dark green color represents the fastest-loading pages. You can see on March 26 how the number of fast-loading pages goes up:

You guessed it, that’s when we added the full page cache.

As you probably know, speed has been officially announced as a ranking factor by Google. There’s no escaping it at this point!

Advanced Tip: Warming up the cache

For the cache to work on all our pages, someone has to visit them first. Having a slow first-time view wouldn’t be the end of the world, but every visitor matters!

Would there be a way to offer the speedy browsing experience to the first person who sees the page? Of course, there is! We will visit them ourselves every night.

Well, not quite.

We can create a cronjob to do the work for us.

Our go-to method is to create a bash script that fetches every URL from the XML sitemap. Below is the code we are working with:

#!/bin/bash
# Warm the full page cache by requesting every URL listed in the XML sitemap.
URL='magentosite.com'

# Fetch the sitemap, extract every URL on our domain and request each one.
wget --quiet "https://$URL/sitemap.xml" --no-cache --output-document - | grep -Eo "https?://$URL[^]<> \"()]*" | while read -r line; do
    # A custom user agent makes the warmer easy to spot in the access logs.
    time curl -A 'Cache Warmer' -s -L "$line" > /dev/null 2>&1
    echo "$line"
done

It would be best to run it at night-time when the traffic is at the lowest point.
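Scheduling it takes a single crontab entry. A sketch, assuming the script above is saved as /usr/local/bin/cache-warmer.sh (the path is our choice, adjust it) and you want it to run at 3 AM:

# m h dom mon dow  command
0 3 * * * /bin/bash /usr/local/bin/cache-warmer.sh >> /var/log/cache-warmer.log 2>&1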

Front End Optimization

JS & CSS

The Magento built-in functions look nice at first sight! In the not-so-distant past, merging all the CSS files into one, and doing the same for the JS ones, would do wonders for your frontend.

Instead of having the user’s browser send something like 50–100 requests, we would have only two. On top of that, if we minify them, we also benefit from smaller file sizes.

The present day introduces new obstacles such as slow mobile connections, lousy HTTPS configurations, misconceptions about HTTP/2, and so on.

However, we always like to look at them as opportunities. So should you!

The trick is to test thoroughly and to look at the data objectively.

And that’s what we did!

Magento 1.X

In the admin we have the options inside System > Configuration > Advanced > Developer:

Magento 1 doesn’t have the minify functionality, so you will need to use a module (like this free one) or more advanced methods.

Now for the tests!

We have tested everything using Google Pagespeed Insights, WebPageTest.org, Google Lighthouse and Google Chrome via a Fast 3G connection.

On Magento 1 the test results seem to point in a merge and minify direction for both JS and CSS files:

Javascript and CSS code is very diverse in the real world. Our recommendation is to do the tests for your particular case.

If you want to use our visualization, click on it and download the workbook from Tableau Public.

And most important: make sure your website is working after you make the changes!

Magento 2.X

On Magento 2 we have an entirely different story.

It all starts with the introduction of require.js, which selectively loads Javascript modules depending on the page.

Here are the optimization options in the admin area: Stores > [Settings] > Configuration > [Advanced] > Developer

You can only see the section if you have developer mode active (bin/magento deploy:mode:set developer).

So for Javascript, we have:

  • merge: one main file for JS, but we still have separate requests from require.js
  • bundle: groups all javascript files into a few bundles (reduces the number of requests)
  • minify: removes the empty spaces from the JS files (reduces their size)

Note that using merge and bundle at the same time is error-prone.

For CSS it’s straightforward (see the command-line sketch after the list):

  • merge: a single file for all the CSS (reduces the number of requests)
  • minify: removes the empty spaces from the CSS (reduces the file size)
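For reference, here is roughly how those toggles map to the Magento CLI. A sketch, assuming Magento 2.2+ (bin/magento config:set was introduced there); enable only the combination your own tests favour:

# Javascript: merge, bundle, minify (0 = disabled, 1 = enabled)
bin/magento config:set dev/js/merge_files 1
bin/magento config:set dev/js/enable_js_bundling 0
bin/magento config:set dev/js/minify_files 1

# CSS: merge and minify
bin/magento config:set dev/css/merge_css_files 1
bin/magento config:set dev/css/minify_files 1

# Redeploy static content and flush the cache so the changes take effect
bin/magento setup:static-content:deploy
bin/magento cache:flush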

Results for every setting combination

*all of our speed tests have been performed under production mode

Here’s how we read the test results:

Again, this is not universally applicable. This is the best setup for our particular case. You need to redo all the tests, analyze their results and build a similar thought process.

HTTP/2 and CDN

HTTP/2: goodbye image sprites & parallelized requests!

The second version of HTTP is one of the most refreshing advancements of the web. Here’s a visual comparison to explain the difference in performance (thanks to Cloudflare):

Please note that HTTP/2 will run only over HTTPS (at least in browsers). To check if it’s working for you, open up the Network tab in the browser’s developer tools and look at the Protocol column:

It would be such a waste not to take advantage of the benefits of HTTP/2 since most modern browsers support it.

If HTTP/2 is not an option, you should at least make sure you have keep-alive enabled. It makes a difference!
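If you prefer the command line, curl can check both. A quick sketch (replace the domain with yours; it needs a curl build with HTTP/2 support):

# Which protocol does the server negotiate? Prints 1.1 or 2
curl -sI --http2 -o /dev/null -w 'HTTP version: %{http_version}\n' https://www.example.com/

# On plain HTTP/1.1, make sure keep-alive is advertised
curl -sI https://www.example.com/ | grep -i '^connection'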

When to use a CDN (Content Delivery Network)

A CDN is always useful since it caches your static content such as CSS, Javascript, image files, fonts, etc. This way, we get faster response times for them.

Here’s an example with our logo:

I think the results speak for themselves! The image loads more than 20x faster.

However, the value a CDN brings is a lot more palpable if your visitors are spread across different parts of the world.

The network uses servers placed in strategic locations, so each visitor is served from the nearest one.

Also, most Content Delivery Networks are using HTTP/2. Since it’s a bit tricky to work with HTTP/2 on the server, a CDN could be the easy way out.

With options such as Cloudflare (free & paid plans) and Amazon CloudFront (free in the first year of use), there’s no excuse not to give it a go!

Useful tip: avoid using too many files from external sources like fonts, CSS/JS files – load them from your own content delivery network, where you have full control over them.

Quick wins that make a difference

Lazy loading images

Very useful for slow mobile connections.

Imagine you are on a product category page with many products. It doesn’t make sense to load all the images at once.

The images should become visible as the visitor scrolls:

That is the smart way to go!

Lossless image optimization

The theme/CMS images, banners, and so on are often not optimized.

Compressing them without damaging quality is doable using tools like ImageOptim (for Mac), FileOptimizer (for Windows), Kraken.io (web service), or JPEGmini.

Gzip: Compression for text files (HTML, CSS, JS, fonts, text, JSON, etc.)

To be able to use it we need help from a server module. On Apache it’s called mod_deflate; on nginx, gzip.
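Enabling and verifying it only takes a minute. A sketch for a typical Ubuntu setup (module and service names may differ on your distribution):

# Apache: enable mod_deflate and reload
sudo a2enmod deflate && sudo systemctl reload apache2

# nginx: add "gzip on;" (plus gzip_types for CSS/JS/fonts) to the http block, then test and reload
sudo nginx -t && sudo systemctl reload nginx

# Verify: the response headers should include "Content-Encoding: gzip"
curl -sI -H 'Accept-Encoding: gzip' https://www.example.com/css/styles.css | grep -i content-encoding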

Below is a comparison test for a CSS file. Have a look at the difference in size!

Without gzip:

With gzip:

Gzip works well only for text files. Don’t try to enable it for images, zip files, etc. It won’t reduce the file size and the loading time could increase. Below is a purely informative test for a gzipped image.

There’s also an alternative to gzip from Google called Brotli. It is available for both Apache and nginx.

From our tests, the results are not very convincing. The file sizes are a bit smaller in most cases, but the browser seems to take more time to load resources compressed with Brotli.

From where we’re standing, Brotli is not worth the headache.

Browser cache

Expires headers are handy since they allow browsers to cache content that doesn’t change very often.

They are essential for the users and also for SEO purposes. Googlebot renders every page, and it would be a drag to load the same static files every time it goes to a new page.

To illustrate the benefits, below are two simulated 3G connections. One with the browser cache active and one without:

The difference is huge!

Make sure you are not setting expires headers for dynamic HTML pages like checkout, login/logout, etc.

It’s a recipe for disaster!

Our recommendation is to avoid expires headers for HTML pages, and keep them only for static files like CSS, JS, images, and so on.
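A quick way to double-check both cases is to compare the headers of a static file with those of a dynamic page. A sketch, with placeholder URLs:

# Static asset: expect a far-future Expires / long max-age
curl -sI https://www.example.com/media/logo/default/logo.png | grep -iE '^(expires|cache-control)'

# Dynamic page: there should be no long-lived caching headers here
curl -sI https://www.example.com/checkout/cart/ | grep -iE '^(expires|cache-control)'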

HTML minify?

The benefits of minifying HTML are minimal. If you are not experiencing any issues with the implementation, I would say “Why not!?”.

Async CSS

If you want to get close to 100/100 on Google Pagespeed tests, this one is a must. You might recognize it as the Optimize CSS Delivery warning.

To overcome this issue, we have to inline the CSS for the above-the-fold area of the webpage and load the rest “Asynchronously”.

Here’s how the loading differs:

There are ways to do it with a grunt plugin or a gulp one.

If you don’t need an extra headache on your server, you can use criticalcss.com. They even have an API to work with. It’s our go-to service!

External fonts

External fonts could become problematic if you load a lot of them. We’ve developed a tool to help download Google fonts locally.

If you have a CDN, you can use our method without any issues. If not, the speed could decrease. It would be safer to load the fonts from Google’s CDN.

So be careful when making the decision!

State-of-the-art server configuration

First of all, it’s not about the server’s hardware; it’s how you set it up.

We’ve seen plenty of situations where websites running on powerful machines have crappy performance. Paying for expensive hosting doesn’t mean much in itself.

So let’s see how we can set one up for high speed.

Web server: nginx vs. Apache vs. IIS

First off, Windows (IIS) is not recommended, but it is possible.

How about between nginx and Apache?

The answer is usually both!

Our go-to setup is to use nginx in front of Apache like so:

  1. nginx as the first layer
  2. set nginx to process the static files (JS, CSS, images, fonts, etc.)
  3. use nginx as a reverse proxy to Apache
  4. leave it to Apache to handle the PHP processing

This way, we have them both doing what they excel at: nginx serving static files and Apache dealing with the heavy PHP load.
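Below is a minimal sketch of the nginx side of such a setup, assuming Apache listens on 127.0.0.1:8080 and the shop lives in /var/www/magento; paths, ports and the TLS/HTTP2 part are placeholders, not a drop-in config:

# Write a bare-bones vhost (adjust before using it for real)
sudo tee /etc/nginx/sites-available/magento >/dev/null <<'EOF'
server {
    listen 80;
    server_name www.example.com;
    root /var/www/magento;

    # nginx serves existing static files itself, with long expires headers;
    # missing ones (e.g. assets Magento still has to generate) fall back to Apache
    location ~* \.(css|js|jpe?g|png|gif|svg|ico|woff2?)$ {
        expires 30d;
        access_log off;
        try_files $uri @apache;
    }

    # everything else is proxied straight to Apache, which handles PHP
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    location @apache {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF

sudo ln -s /etc/nginx/sites-available/magento /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx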

Here is an example setup tutorial for an Ubuntu server.

PHP

The main thing to keep in mind is that Magento will run a lot better on PHP7 compared to any 5.x version.

More than that, starting with version 2.2, Magento supports only PHP 7 and higher.

M1 is not compatible with PHP7, but there is a free community extension to help with that.

Of course, not all modules are compatible. Nevertheless, from our experience, it’s totally worth the hassle of dealing with bugs that will inevitably arise.

The second recommendation is to use OPcache, a PHP server extension that speeds up code execution. To be able to use OPcache you have to be careful with which PHP handler you use, because some of them block the cache.

To test if it’s working, head over to a phpinfo page and go to the Zend OPcache section. Navigate a page or two on the website and refresh the phpinfo page. If the Used memory field does not change, it is very likely that OPcache is not working.

You will have to switch to a PHP handler that works with OPcache.
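A quick sanity check from the command line also helps (keep in mind that the CLI can use different ini settings than the web handler):

# Is the extension loaded at all? Should print "Zend OPcache"
php -m | grep -i opcache

# Is it switched on for this SAPI?
php -i | grep -i 'opcache.enable '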

Last but not least, the realpath_cache setting. It has been quite the subject within the community!

We found an in-depth technical explanation of it here. Bottomline: it’s beneficial for platforms that are using many files.

Also, we should increase the realpath_cache_size to more than the old default of 16K. Newer versions of PHP have set the default at 4M.
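If you’re on an older PHP, the two relevant php.ini directives look like this. A sketch assuming PHP-FPM 7.2 on Ubuntu (for mod_php, drop the file under the apache2/conf.d directory and reload Apache instead); the values are a common recommendation, not gospel:

# Add an override file with bigger realpath cache values, then reload PHP
sudo tee /etc/php/7.2/fpm/conf.d/zz-realpath.ini >/dev/null <<'EOF'
realpath_cache_size = 4M
realpath_cache_ttl = 600
EOF
sudo systemctl reload php7.2-fpm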

Database

Above everything, the most significant win you can get for your database is to keep it clean and to keep the number of queries to a minimum.

With that in mind, let’s move forward!

Replacing the standard MySQL with MariaDB or Percona could deliver better results.

These are MySQL forks which are meant to offer superior scalability. Both of them use a storage engine called XtraDB, which allows the use of high-availability clusters.

The best way to tweak the settings for your database is to use a simple tool like mysqltuner. It offers performance suggestions that we can apply right away!
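Running it boils down to downloading a single Perl script. A sketch (the short URL is the project’s usual download location; verify it before pointing anything at your database server):

# Fetch MySQLTuner and run it against the local database
wget -q http://mysqltuner.pl/ -O mysqltuner.pl
perl mysqltuner.pl --host 127.0.0.1 --user root --pass 'your-password'
# Read the "Recommendations" section at the end of the output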

Amazon infrastructure

If you are planning to use Amazon RDS to host your database, you might want to look into Aurora (a DB created by the folks at AWS).

They promise a 5x increase in performance compared to the standard MySQL.

Even though that’s not really the case in real-life situations, Aurora brings autoscaling to the table.

It might seem like “an Enterprise topic”, but the ability to scale automatically could be useful for small to medium businesses as well.

Especially for those that experience floods of traffic in certain seasons (Black Friday, Christmas, sales periods, etc.).

Session storage

Magento offers the option to save visitor session data either using the File System or the Database.

Both options have at least one drawback:

  • if you want to store them on disk, the overall speed could become slower at high traffic volumes, but you’ll have stability
  • if you store them in your database, you’ll gain some speed, but its size could rapidly escalate

Redis and Memcache to the rescue

Both are powerful in-memory storage engines which will improve performance for websites with lots of visitors. You get the best of everything: stability and speed.

In theory, Redis should perform better, but you can find out what’s right for you only by testing.

There are usually three steps involved in setting up either one (a command-line sketch for the Redis route follows the list):

  1. installing Redis/Memcached
  2. installing the PHP extension that communicates with Redis/Memcached (optional in some cases)
  3. configuring Magento to store the sessions in Redis/Memcached
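Here’s what those three steps can look like for Magento 2 on Ubuntu. This is only a sketch: the package names assume a recent PHP 7.x, and the setup:config:set session flags assume Magento 2.3+ (on Magento 1 you’d edit app/etc/local.xml by hand instead):

# 1. install Redis itself
sudo apt-get install -y redis-server

# 2. install the PHP extension that talks to Redis
sudo apt-get install -y php-redis && sudo systemctl reload apache2

# 3. point Magento's session storage at Redis (this writes to app/etc/env.php)
bin/magento setup:config:set \
  --session-save=redis \
  --session-save-redis-host=127.0.0.1 \
  --session-save-redis-port=6379 \
  --session-save-redis-db=2
bin/magento cache:flush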

Redis guides

The Redis installation should be very straightforward.

Here are some examples of how to do it on Ubuntu and CentOS. These should only serve as inspiration since every server is different.

Don’t worry, we didn’t forget the integration instructions from Magento for 1.X and 2.X!

Memcache guides

Here’s a guide from Magento on how to accomplish the setup on 2.x.

For 1.x versions, the process is pretty much the same. The main difference is that you will have to edit the Magento configuration from the local.xml file as opposed to env.php on M2.

Please don’t forget to migrate your existing sessions when moving to a new storage engine! You don’t want to end up with lost carts, logged out customers, and so on.

Advanced tip: if you have Redis on the same machine as PHP, using Unix sockets instead of TCP connections might give you a small boost in speed.

Server Infrastructure

This is one of the most sensitive topics out there for businesses moving towards the enterprise area.

The easiest way would be to put everything we talked about on the same machine. Maybe even add the email server and an ERP as the cherry on top 🙂

Which is fine!

It might work well if you are careful with the limits you set for each and every service.

However, at some point, you might conclude that every service has its own unique needs.

For example, the database might work better on a machine with lots of RAM, while the application (PHP) might require a compute optimized environment with better CPUs. An email server would work best as a separate service, while an ERP might need additional software to run.

If you mix everything in the same bowl, you might end up with a crappy taste at the end.

Inspiration

We are not going to say which setup is best because there is no such thing. Each business case is distinct, and it has specific needs.

We are just having a brief look at some popular architectures designed for Magento.

The Magento Enterprise one:

This one uses an AWS infrastructure from Amazon:

The following one focuses more on the instance type provided by AWS:

This one is a bit more on the abstract side:

A more specific one (mentions the type of web server, database, and cache engine):

Here are some conclusions we can draw:

  • you can break them down into two major categories: auto scalable and manually scalable
  • notice how all of them are using load balancers
  • all the AWS ones are using ElastiCache for Redis/Memcache
  • all of them are using database read replicas
  • some of them suggest having the admin area on a separate instance
  • some of them include a CDN
  • the Magento Enterprise architecture seems a bit lame if compared with the other ones

If you don’t have the budget for enterprise infrastructure, you could still use a solution like Docker to keep everything in its own container, like in this example.

A few things that make an infrastructure great:

  • it’s highly available
  • it’s extremely reliable
  • it brings solid performance
  • it’s scalable (even better if it scales on its own)
  • it’s secure
  • it has you covered for disaster recovery
  • it’s easy to use

Progressive Web Apps

If you are unfamiliar with this latest trend started by Google, you might want to read a bit more about it.

In a nutshell, PWAs allow users to use your website as an app without having to download it from the app store. At first, they were only available on Android devices, but Apple started supporting them as well.

A Progressive Web App is basically a high-speed version of your mobile website with an extra script, called a Service Worker.

Wait a minute!

Why are we talking about Progressive Web Apps in an article about Magento Speed Optimization?

Mainly because the key to having a PWA is to obtain incredible loading times on slow mobile connections.

To make it more concrete: the time to First Interactive should be under ten seconds.

The tool to help with the testing is called Lighthouse and it’s right under our nose, inside the Google Chrome console.
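Besides the DevTools panel, Lighthouse also ships as a command-line tool, which is handy for repeatable, scriptable runs. A sketch (requires Node.js and Chrome; flags change between Lighthouse versions):

# Install once, then audit any URL; --view opens the HTML report when the run finishes
npm install -g lighthouse
lighthouse https://www.example.com/ --view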

It might seem easy to obtain a 10-second timing at first, but it’s not. Not even close!

Not in a world where we have all kinds of tracking and marketing scripts installed and where we use tons of high-quality images.

PWAs for Magento

From what we see in the community, the trend is to build separate mobile websites using Javascript frameworks.

There’s even a project called Magento PWA Studio based on ReactJS.

It’s true that Javascript frameworks are all the hype right now, but not everyone might be willing to put in the effort at this point in time.

The good news is that it’s doable. The bad news is that it’s not going to be easy.

Nonetheless, I think you should do it even if you don’t care about Progressive Web Apps, because you should care about users on mobile devices with slow internet connections.

The main things to keep in mind are:

  1. You have performance Opportunities inside the Lighthouse test. Use them!

  2. External scripts are killing your performance. You will have to test each and every one of them to see if they break your performance.

Most script creators don’t care about your speed, so don’t waste time dwelling on them.

  3. Optimize your images to the bone or you will encounter loading problems on 3G connections
  4. Use lazy-loading for images: see the advantages in the quick wins section of the article
  5. Using a CDN along with HTTP/2 will improve the speed according to our tests
  6. The days when we were hiding content with CSS are gone! Keeping DOM elements to a minimum finally makes sense.
  7. You might find it useful to avoid loading certain elements or features on mobile devices. That is not wrong! Just make sure you have separate caches for mobile and desktop.
  8. It will take time to reach a reasonable score, so arm yourself with patience!

Final thoughts

After going through all of this, you might see how it’s not that difficult after all. It’s just a lot to deal with. The thing is, you have to do it one way or another.

The benefits will hopefully give you a great sense of satisfaction!
