How I built this site

My site has been redesigned three times, and to no one's surprise, I've done it again. My old page was difficult to update, lacked a consistent design and was missing the interactivity that most modern pages have.

Most of the core features and functionality of the site remain the same. But one thing that really excites me is the process I used to reimagine my brand and create a user experience that reflects everything I've learnt in the previous months. Here's a summary of the exact steps I followed.

JavaScript all the things!

My last site used Jekyll, which is a powerful static site generator but lacked interactivity and a good way to test new changes in a staging phase. I've switched to Phenomic, another static site generator that uses React under the hood. This framework is awesome and allows this page to reuse more than 80% of the code you see in front of you via components and templates.

I started by following the Getting started docs on the Phenomic page, which were well written and explained all the key concepts and how everything works under the hood. Once I had created a basic template and built the first Home page, I realized I needed SASS support and CSS Modules, which weren't included in the base React preset.

All I did to fix this problem was create my own webpack config, copying the base config the preset used. From there, I could customize the base config further, and I added a few things to it:

SASS support

SASS allows you to create stylesheets without duplicating anything by using variables, mixins and placeholders. Variables are just that, variables (which you can find in any programming language). Mixins are functions which can generate new rules and wrap CSS content in @media queries, for example. Placeholder selectors allow you to reuse CSS by chaining together all the selectors that @extend them in a single statement:

%base {
  color: #ccc;
}

.button {
  @extend %base;
}

.label {
  @extend %base;
}
will produce:

.button, .label {
  color: #ccc;
}
Later, I added the following bits to my Webpack config:

import path from 'path';
import ExtractTextPlugin from 'extract-text-webpack-plugin';

const config = {
  module: {
    rules: [
      {
        test: /\.scss$/,
        loader: ExtractTextPlugin.extract({
          fallback: 'style-loader',
          use: [
            {
              loader: 'css-loader',
              options: {
                importLoaders: 1,
                modules: true,
                minimize: isProd,
                localIdentName: `[hash:base64:5]${!isProd ? '_[local]' : ''}`
              }
            },
            {
              loader: 'fast-sass-loader',
              options: {
                includePaths: [path.resolve(APP_DIR, 'sass')]
              }
            }
          ]
        })
      }
    ]
  },
  plugins: [
    new ExtractTextPlugin({
      filename: 'styles.css',
      disable: !isStatic
    })
  ]
};

First, we need to import ExtractTextPlugin, which will create a separate CSS file with all the styles we've used in our code. Then, we declare a new rule for .scss files: their contents first run through fast-sass-loader and then through css-loader (which has CSS Modules support, custom class names and automatic minimization enabled), and the result gets extracted. fast-sass-loader is better than the standard sass-loader because it caches all the compiled resources, resulting in far faster compile times. I've even contributed to this project by updating it to webpack 3 and adding standard data config value support (you can see my PR with more details here).

Finally, we need to declare the ExtractTextPlugin in the plugins config section, which will extract all our styles into the styles.css file. Also note that a lot of config options, such as class name obfuscation or extraction, are only enabled in production.
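For context, the isProd and isStatic flags referenced in the config are simple environment checks. A minimal sketch of how such flags can be derived (NODE_ENV is standard; the BUILD_TARGET name is an assumption, not Phenomic's actual variable):

```javascript
// Build flags used throughout the webpack config. NODE_ENV is the standard
// Node.js convention; BUILD_TARGET is a hypothetical variable name that your
// own build scripts would set during the static export step.
const isProd = process.env.NODE_ENV === 'production';
const isStatic = process.env.BUILD_TARGET === 'static';
```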

PWA goodness

Newer browsers allow sites to run custom JavaScript even when the site isn't loaded, and this opens up a world of possibilities such as push notifications, offline support... These JS files are called service workers, and I've used one to cache all the static files and allow offline browsing. Unfortunately, Phenomic doesn't come with this feature out of the box, so I had to implement part of it myself. Here's how:

First, I added the SWPrecacheWebpackPlugin into my Webpack config:

  plugins: [
    isProd &&
      new SWPrecacheWebpackPlugin({
        dontCacheBustUrlsMatching: /\.\w{8}\./,
        filename: 'service-worker.js',
        minify: true,
        navigateFallback: `${publicUrl}/index.html`,
        staticFileGlobsIgnorePatterns: [/\.map$/, /asset-manifest\.json$/]
      })
  ]

This plugin will cache everything except files that match the staticFileGlobsIgnorePatterns regexes. We only want this plugin to run in production, as a development environment shouldn't cache anything.

Then, we need to manually register the service worker. All I did was create an extra JS file called registerServiceWorker.js containing this function, which I later import and call from my main index.js file:

export default function register() {
  if (
    process.env.NODE_ENV !== 'production' ||
    typeof navigator === 'undefined' ||
    !('serviceWorker' in navigator)
  ) {
    return;
  }

  window.addEventListener('load', () => {
    const swUrl = `${process.env.PUBLIC_URL}/service-worker.js`;

    navigator.serviceWorker.register(swUrl).catch(error => {
      console.error('Error during service worker registration:', error);
    });
  });
}

First, we check if we are in production and whether the browser supports the service worker API; if not, we just don't register the worker. Then, we wait until the page has fully loaded, and finally, we register the service worker created by SWPrecacheWebpackPlugin.

We also need a manifest.json file at the root of the server that declares how the site should look when installed as a standalone app, plus additional features such as the navbar's color on mobile browsers like Chrome.

The Google Developers site has a wonderful explanation of how to create one, and they even have a generator that does all the dirty work for you. Finally, we need to reference the manifest file inside the HTML <head> tag like so:

<link rel="manifest" href="/manifest.json" />
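A minimal manifest.json can look like this (all values here are illustrative, not the exact ones this site uses):

```json
{
  "name": "Hugmanrique",
  "short_name": "Hugmanrique",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#2979ff",
  "icons": [
    {
      "src": "/icon-192.png",
      "sizes": "192x192",
      "type": "image/png"
    }
  ]
}
```

The theme_color field is what mobile Chrome uses to tint the navbar mentioned above.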

Lighthouse audit results showing 9 passed items

That's it! You now have a fully working PWA-enabled site! Run it through Lighthouse and you will see that a lot of things have improved with just ~30 lines of code.

More stuff!

This post would be too damn long if I included an explanation of every npm package I've used, so here's a list of the ones that have saved me the most time:

  • classnames (npm) is a utility to join classNames together based on conditions.
  • react-helmet (npm) manages all the document head changes for you. Don't worry about having two different <title> tags anymore.

Much needed redesign

I've been closely following UX and UI topics on Medium for a while, and I think every developer should know at least the basics of making something usable and eye-pleasing at the same time.

In order to achieve this gorgeous new look (fight me if you think it isn't!), I applied three key concepts throughout the entire building process:

Reusability in mind

I chose SASS early on for a reason: it allows you to define the basic variables you're gonna need all over the place, and it makes things such as mobile-friendliness painless with a simple four-line mixin.

Here are all the things I declared in a _variables.scss file:

  • Font sizes: headers and content should always have the same text sizes in order to improve readability.
  • Colors: I chose a primary, secondary, warning and gray color palette that I reuse in every component. Nobody likes to see 400 colors in front of their faces! I limited myself to a light and a dark variant of each of those colors, which I use on :hover and when things get triggered/activated.
  • Spacing: as you can see, the margins and paddings on this site are always drawn from the same six values. I wanted a clean grid design, and this decision simplified a lot of it.
  • Viewports: I declared small, medium and large viewports which I later combine with minWidth and maxWidth SASS mixins so that viewport changes behave the same across all the different layouts. I chose 576px, 768px and 1200px, the same breakpoints the Bootstrap framework employs. Bootstrap is widely used on a lot of sites (>16%!), and all the decisions they take are carefully tested before being merged into their main branch.
  • Miscellaneous stuff such as border radii, shadows, z-indexes...
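A sketch of what such a _variables.scss can look like (the values here are illustrative, not my exact ones):

```scss
// _variables.scss — illustrative values, not the exact ones used on this site.
$font-size-base: 1rem;
$font-size-h1: 2.5rem;

$color-primary: #2979ff;
$color-primary-dark: darken($color-primary, 10%);

$spacing-unit: 8px;

// Viewports matching Bootstrap's breakpoints.
$viewport-small: 576px;
$viewport-medium: 768px;
$viewport-large: 1200px;

// The kind of four-line mixin that makes mobile-friendliness painless.
@mixin min-width($width) {
  @media (min-width: $width) {
    @content;
  }
}
```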

Consistency is key

One of the reasons I love React is its component nature, which allows you to reuse simple elements such as buttons, icons or cards. I've followed the Atomic Design methodology and it has yielded great results. In short, this technique uses real-life concepts such as atoms, molecules and organisms to explain design:

  • Atoms are the smallest functional units: basic HTML elements such as form inputs or buttons that can't be broken down any further without losing their meaning.
  • Molecules are relatively simple groups of UI elements functioning together as a unit. For example, the navbar in this site is made out of a series of page links.
  • An organism is a relatively complex component composed of groups of molecules and/or atoms. Let's take my home page as an example: the Projects section is composed of cards (organisms) which contain a small button, an image, headers and text.
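The hierarchy above can be sketched in plain JavaScript; this is a loose illustration of the composition idea, not actual code from this site (all names are hypothetical):

```javascript
// Atom: the smallest unit, e.g. a link.
const link = ({ href, text }) => `<a href="${href}">${text}</a>`;

// Molecule: a simple group of atoms working as a unit, e.g. a navbar of links.
const navbar = pages => `<nav>${pages.map(link).join('')}</nav>`;

// Organism: a more complex component composed of molecules and atoms.
const header = pages => `<header>${navbar(pages)}</header>`;

console.log(header([{ href: '/', text: 'Home' }, { href: '/blog', text: 'Blog' }]));
// → <header><nav><a href="/">Home</a><a href="/blog">Blog</a></nav></header>
```

In React the idea is the same, except each level is a component that renders the levels below it.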

Simplicity rules a.k.a KISS

As you can see, most of this page is whitespace. While you might think "omg so much wasted space pls fix", our eyes can only focus on a certain number of items at once. Adding distracting moving elements, or elements that force the entire page to repaint and push all the content further down, deteriorates the UX and will make users leave your site once they've scrolled 25 times in search of the single paragraph they were actually interested in when they found you on Google.

The only animated components are the console carets shown at the end of the page title, and I even thought about removing them...

I also tried to make content as concise and short as possible because I don't want to waste your time.

What did you focus on the most?

Having Google Analytics on your site is a powerful way of knowing what your users are struggling to understand or get working. I noticed the mean time spent on specific pages was pretty high relative to how long they were, and I've tried to improve that this time. So far, I've only received positive feedback from people (on Twitter and other social media platforms) and robots (objective evidence for my experiment).

How did you move the old blog posts from Jekyll to this site?

One of the great capabilities Phenomic has (and probably why I chose it!) is Markdown file rendering, which you can later consume by creating containers that query a database with all the collections you have. First, I imported some Phenomic functions at the top of my BlogPost component (which renders the page you're seeing right now):

import {
  createContainer,
  query
} from '@phenomic/preset-react-app/lib/client';

Then, instead of exporting my BlogPost stateless functional component directly, I wrapped it in the createContainer function, which is in charge of asynchronously loading the container props based on the props passed from the parent. Essentially, it maps an id to the blog post's content, date and title. Here's how:

export default createContainer(BlogPost, props => ({
  page: query({ path: 'posts', id: props.params.splat })
}));

As you can see, the query function takes a collection and performs a search across all the files inside the content/<collection_name> directory. And how do you set each file's props? you ask. The answer is quite simple: the same way Jekyll handles it, with front matter blocks. These are the initial sections of Markdown files, delimited by ---, which contain YAML or JSON defining some props.

The good (or bad?) thing is that Phenomic doesn't force you to use predefined variables, so you have full control over what your variables are called and what they can contain. For example, in order to separate the old posts from the newer ones yet to come, I set an old tag to true in all the old posts instead of querying 2 different collections, which could become a bottleneck in the future if I write a lot of posts.
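For instance, the front matter block of a migrated post could look like this (the field values are illustrative):

```yaml
---
title: An old post migrated from Jekyll
date: '2016-01-01'
old: true
---
```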

Where are you hosting this site?

Before thinking about redesigning my site, I noticed the web routing structure I had in place was really messy, and a 1 second time to first byte was pretty common for non-cached requests. Previously, when you connected to this site, you went through the Cloudflare POP nearest to you, and that POP then requested the files (if they weren't cached) from a collection of OVH VPSs located in France. Then, depending on which service you wanted to visit (my site, GlobalStatus, PlayManager, payments, etc.), an Nginx instance would serve a static file or a PHP-processed file, or redirect you to a GitHub Pages repo that contained all the static files generated by Jekyll. As you can see, establishing at least 2 connections before a file could be served wasn't optimal (see the graph below), so I decided to remove Cloudflare from the equation for static pages, and I created a series of subdomains that sent traffic directly to the French load balancers while still passing through Cloudflare.

Previous times to first byte

I had used Netlify before and instantly fell in love with its features, interface and easy-to-set-up GitHub integration. I've found these features really useful when using their service:

  • Global CDN: this is one of the reasons I could remove Cloudflare from the equation. They handle DDoS mitigation, automatic scaling and much more for me, plus their uptime is great.
  • Painless HTTPS setup: you automatically receive a free Let's Encrypt HTTPS cert with a single click. Before, I had a crontab task that broke twice in a row when my certs needed to be renewed, and therefore caused downtime. Not anymore.
  • Staging phase: Netlify allows you to stop new deployments and stick to the current working commit. Subsequent commits to the master branch won't get published, but Netlify gives you a custom URL which you can visit to preview the new changes.
  • Compiles my site on their servers: the fact that I don't have to commit a node_modules folder which weighs megabytes (mainly because it contains development tools such as webpack), and that they always have the latest version of Node.js, is pretty awesome and saves a lot of time.
  • Asset optimization: although Webpack takes care of minimizing JavaScript and CSS, Netlify makes my images smaller and prettifies my URLs.
  • Custom redirects: as I said before, I decided to move every independent service to its own subdomain, but I wanted to keep old redirects in place because Google and social media posts would still link to the old URLs. With Netlify, you create a _redirects file and define redirects in a key-value fashion. They even support custom URL params and splats (*).
  • Setting headers: Netlify gives you the option to send per-route headers. I just used it to send security headers such as CSP or X-XSS-Protection on every route.
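As a sketch, the _redirects and _headers files mentioned above can look like this (the paths and policies here are placeholders, not my actual rules):

```
# _redirects — key-value redirects, with splat (*) support
/blog/*    https://blog.example.com/:splat    301

# _headers — per-route headers
/*
  X-XSS-Protection: 1; mode=block
  Content-Security-Policy: default-src 'self'
```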

In order to deploy the new site without downtime, I needed to have the new site correctly set up on Netlify; after that, a DNS change on the Cloudflare dashboard would be sufficient. You never know if something's broken until the moment of truth, but it was a perfect deploy, no downtime at all (yay!).

So that's all the fundamentals. Looking back on it, my site definitely needed a redesign, and it was one of the most enjoyable projects I've tackled that actually reached production (which is about 1/10 of all the things I do in my free time). This all took me 5 days, although some bugs in external projects slowed down the process of deploying the new site. I want to thank the Phenomic team for their awesome static site generator and Netlify for offering all of the above for free.

© Hugmanrique. Made with