How We Made Our Website Fast


October 16, 2014

One of the core principles that we have at Retro Mocha is that we build native apps. We build native apps because they are usually faster and more responsive than something that isn't native. It's about delivering a better user experience. When it comes to user experience, fast is better than slow.

I wanted to take the idea of making things fast and apply it to the Retro Mocha website. Our website wasn't the slowest website on the internet, but it wasn't the fastest either. It probably took 0.75-2 seconds to load a page, depending on the page, internet speed, caching, etc.

My goal for this project was to get Retro Mocha's page load times down to 0.25-0.5 seconds per page. That would put our site around the top 1% of all websites. It also would make page loads feel nearly instant on broadband connections and very fast on 3G mobile connections.

Where We Started

Just looking at our home page, we had 15 or so HTTP requests, about 300k of data downloaded to render the page, and an average load time of over a second.

This is a lot faster than many websites, but it is not fast enough for me. I want the website to feel instantly responsive on both desktop and mobile.

The blog pages, with social widgets and Disqus comments, were around 75 HTTP requests and over 1.2 MB in page size. They took at least a couple of seconds to render most of the time.

Our website is static!

One thing that helped us in this process is the fact that our website is built using Middleman, a static website generator written in Ruby. Static HTML websites have a tremendous advantage over a CMS like WordPress: they are very fast and efficient because you aren't spending CPU cycles rendering pages or talking to a database. Also, because it is just HTML files, you can take those files and host them anywhere.

So, we were fortunate enough to start with our website being static HTML files that we hosted on Amazon S3. Middleman made that incredibly easy and that made some of our later optimizations much easier.

To host on Amazon S3, we use the s3_website gem, which makes it incredibly easy to upload our website to S3 with a single command. It also handles gzipping the files, caching, and Amazon CloudFront, all of which helped out a ton later.
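
For anyone curious, the relevant parts of an s3_website.yml config are only a few lines. This is a generic sketch - the bucket name and environment variable names are placeholders, not our actual config:

    s3_id: <%= ENV['S3_ID'] %>
    s3_secret: <%= ENV['S3_SECRET'] %>
    s3_bucket: example-bucket-name
    gzip: true

With that in place, running s3_website push gzips the right files and syncs the built site up to S3.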

Step 1 - Test your site

The first step in making your website fast is to measure it with various site speed tools.

Pingdom has a great tool that will show you how long your page takes to load, what things you can improve, etc.

YSlow is a great tool for giving performance recommendations.

Google PageSpeed is another browser tool that can tell you some things to fix.

The general approach I use is to get the highest score I possibly can on these benchmarks, while recognizing when I can't make any more progress and being willing to move on to the next optimization win. Right off the bat a few things really jumped out at me - caching, extra HTTP requests, and file size.

In general, the fastest website is the one with the fewest HTTP requests and the smallest file sizes, with caching applied wherever appropriate.

Step 2 - Add cache headers

Simply adding cache headers to your web pages can make a tremendous difference to the end user experience. When your browser requests a page or file from the web server and the response includes a cache header, the browser will save that file for reuse later.

So, if you have a shared CSS file and you tell the browser to cache it for a week, the browser will only download that file on the first page load; every other page will reuse the CSS file it already has. This saves bandwidth and HTTP requests and makes the end user experience much better.
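
For example, a response for a shared CSS file with a one-week cache looks something like this (headers trimmed down to the relevant ones):

    HTTP/1.1 200 OK
    Content-Type: text/css
    Cache-Control: max-age=604800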

To add cache headers with Middleman and s3_website, I added

    max_age:
        "assets/*": 604800
        "*": 300

to the s3_website.yml config file.

This tells the s3_website gem to set the cache headers that S3 should serve for the various files. All of the static assets (CSS, JavaScript, and images) are browser-cached for a week (604,800 seconds), and all the individual pages are cached for 5 minutes (300 seconds).

To make this work even better, Middleman generates a uniquely named version of each static asset - images, JavaScript, and CSS files - whenever the file changes. That means when you change a file, you don't have to worry about invalidating browser caches; Middleman always links to the latest version.
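
This behavior comes from Middleman's asset_hash extension. A minimal config.rb sketch, assuming a standard Middleman setup:

    # config.rb
    configure :build do
      # Append a content hash to each asset filename, e.g. site-0a1b2c3d.css,
      # so a changed file gets a new URL and stale browser caches never apply.
      activate :asset_hash
    end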

Step 3 - Eliminated unused files and JavaScript

Quite a few unused files had accumulated as part of the template - a small CSS file we didn't need, a Google web font we weren't using, stuff like that. Removing those saved a few HTTP requests and some download size.

One of the things I noticed was that we were including Bootstrap's JavaScript and jQuery even though our site doesn't do anything dynamic or interesting with JavaScript.

So, goodbye JavaScript.

That saved probably 50-100k per page load and a few HTTP requests.
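
In practice, this step was just deleting a couple of script tags from the layout. The paths here are illustrative, not our exact filenames:

    <!-- removed: nothing on the site actually used these -->
    <script src="/javascripts/jquery.min.js"></script>
    <script src="/javascripts/bootstrap.min.js"></script>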

Step 4 - Eliminated social sharing

Right now our site doesn't get a ton of traffic and doesn't get a ton of social shares either. However, it does get some traffic from Twitter for our work on Obvious.

What I noticed is that people who share our stuff socially weren't using the like or share buttons after each post. They were sharing our stuff naturally, which is better for us than slapping social sharing buttons all over the place.

After taking some time to think about it, I got rid of the social share buttons. They make a lot of HTTP requests and don't add much value. Removing them saved us probably 30 HTTP requests.

Step 5 - Eliminated Disqus Comments

At this point we had eliminated most of the extra HTTP requests, and I thought we should be down to 10 or fewer on every page. But on our learn blog we still had a lot, and they were coming from Disqus.

We used Disqus because it's a popular blog commenting engine that works everywhere, and we thought it was a good thing to have on our site. Then I realized it added about 40 HTTP requests to the page load, just like the social share plugins did. Ugh.

To be completely honest, blog commenting doesn't add value on most posts because we don't have a lot of traffic right now. Even without traffic, blog commenting makes your site a target for web spammers to drop links.

We decided to kill our Disqus blog comments. In their place we have a giant link to tweet a comment at us. I think it's a good tradeoff: it keeps our website fast and still allows us to have social interaction.
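
The replacement is just a plain link to Twitter's tweet web intent, something like the following (the handle and wording are placeholders, not our exact markup):

    <a href="https://twitter.com/intent/tweet?text=%40retromocha%20">
      Have a comment? Tweet it at us.
    </a>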

Between removing social sharing and Disqus comments, I eliminated about 70 HTTP requests and over 1 MB from the page size. That was a pretty big win for the blog pages.

Step 6 - Put the website on CloudFront

Our website already ran on Amazon S3, which is pretty darn fast to begin with, but I've read enough articles over the years about the speed advantages of Amazon CloudFront CDN that I went ahead and configured our S3 bucket to mirror to CloudFront.

It took a few hours because the instructions I started with didn't work with the way our pretty URLs rely on index.html files in the root of each directory. Eventually I found a small setting that changed the way CloudFront mirrors S3, and then it worked like a champ.
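
If you hit the same wall, the setting that usually matters is the origin. The plain S3 endpoint won't serve index.html for directory URLs, but the bucket's static website hosting endpoint will, so point CloudFront at the website endpoint as a custom origin. I can't promise that's the exact setting in every setup, but it's the usual culprit (bucket name and region here are placeholders):

    # S3 REST endpoint - /blog/ does NOT resolve to /blog/index.html:
    #   example-bucket.s3.amazonaws.com
    # S3 website hosting endpoint - /blog/ DOES resolve, use this as the
    # CloudFront custom origin:
    #   example-bucket.s3-website-us-east-1.amazonaws.com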

Now all the files are mirrored by CloudFront across a ton of edge locations, which makes the site even faster and more responsive.

Step 7 - Switched from Bootstrap to PureCSS

This step took a bit of work because it involved changing a lot of the HTML. The problem was simply that Bootstrap, while fantastic, was giving us a lot of features and functionality that we just don't need on our website. I had already taken out the JavaScript side of things, but the CSS file was still around 100k uncompressed, or about 18k with gzip compression.

PureCSS is a much smaller CSS library that gives us the bare minimum of what we need. Combined with our site styling, it comes to only 25k of CSS and about 5k compressed.
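
Most of the HTML changes were translating grid markup from Bootstrap's classes to Pure's. Roughly this flavor of change, simplified from what our actual templates look like:

    <!-- before: Bootstrap grid markup -->
    <div class="row">
      <div class="col-md-6">...</div>
      <div class="col-md-6">...</div>
    </div>

    <!-- after: the PureCSS equivalent -->
    <div class="pure-g">
      <div class="pure-u-1-2">...</div>
      <div class="pure-u-1-2">...</div>
    </div>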

It's not a huge savings, but shaving 13k off of every page load and shrinking the number of CSS rules the browser has to match makes a difference. Every little bit counts.

Step 8 - Resized and optimized images

The easiest and fastest wins in terms of website speed and performance usually relate to images and image size. Basically, most images are the wrong resolution and don't have enough compression applied. That can mean large file sizes that slow down your website dramatically, especially on 3G mobile connections.

To fix our images, I went through and first made sure that we were using the right sized image. Many times you will be displaying a relatively small image, like 200x200, but the file itself could be 800x800. Web browsers will just resize the images for you, but they still download the large file. In the example above, the 800x800 file has sixteen times as many pixels as the 200x200 version, so resizing it to the correct dimensions can easily shrink the file size by 90% or more.

After you have the right sized image, you want to export it in a way that reduces file size while retaining an appropriate image quality. This varies from image to image, but I always export for web, and for photos I use JPEG at 50-70% quality depending on how it looks and where I'm using it. For images with transparency, use PNG.
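
If you'd rather script the resize and export instead of doing it in an image editor, ImageMagick (not something I mentioned above, just one scriptable option) can do both in one pass. Filenames here are made up:

    # Resize to 200px wide (height scales to keep the aspect ratio)
    # and export as a 60% quality JPEG
    convert photo-original.jpg -resize 200x -quality 60 photo.jpg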

The next step is to run that file through other programs that strip out any extra file size you don't need. On Mac, I use ImageOptim to shrink my image files. It works great for JPEG files, and for PNG files I run them through TinyPNG first, then ImageOptim. That workflow tends to give me the smallest file size.

In many cases I took files that were 50-100k down to 25k or less. Many are now 5-15k. That doesn't just save bandwidth, it makes the pages load faster and really improves the user experience, especially on mobile.

Step 9 - Removed Web Fonts

Web fonts are great, and for the last year or so we've used Roboto from Google Web Fonts to give our site a nice, consistent typeface. Roboto fit our style in a way that was cross-platform and worked everywhere. However, it added about 5 HTTP requests and 50k or so to our page load.

After doing a little research, I was able to swap out Roboto for a Helvetica Neue based font stack that works pretty well on most platforms. The biggest problem I ran into was Windows, where I had to use Segoe to achieve the same basic style for our brand. It looks pretty good in most browsers, but Chrome on Windows isn't quite as nice as IE or Firefox at rendering the font. Oh well.
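
The resulting font stack looks roughly like this - an approximation, not our exact stylesheet:

    body {
      /* Helvetica Neue on Mac/iOS, Segoe UI on Windows, then fall back */
      font-family: "Helvetica Neue", "Segoe UI", Helvetica, Arial, sans-serif;
    }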

Removing web fonts makes our site that much faster and more responsive.

Other interesting things I learned

Surprisingly, with Google Web Fonts and Google Analytics, some of the files don't have as long a cache expiration as Google recommends. They do this for good reason - short caches let them push updates quickly - but it's sort of bizarre to see Google's own services break their own performance guidelines.

Speed really impacts how a site feels. So many sites now are junked up with ads, social sharing widgets, and videos that play on their own that it is really nice to hit a site that loads fast. Even sites that I enjoy, like The Verge, are so filled up with images and video that they load slowly and feel sluggish.

Our computers keep getting faster, but we keep making our websites slower.

Making our website fast took a good bit of work, but we now have one of the fastest sites on the internet. According to Pingdom's website speed tool, our site is faster than 99% of sites tested. Our home page now sits at 5 HTTP requests and 21.4kB in page size, and it loads in about 0.3 seconds. Most of the other pages are now at 5-10 HTTP requests and under 50k in page size, and they load in under 0.5 seconds. Our site is now 4-8x faster than it was before.

That's pretty fast.

- Brian




About The Author

Brian Knapp is one half of the dynamic duo that runs Retro Mocha. You can follow his hilarious puns on Twitter.