Guest Article

The “right” way to render your Jamstack site

Learn how and when to render your site using JAM!

Ben Holmes is a full-time developer, designer, blogger, and all-around person exploring how the web works. In his spare time, he loves arguing why CSS is a programming language, drowning in Japanese city pop, and teaching the world through code. You can find Ben on his Twitter, LinkedIn, and personal website.


The Jamstack is all the rage in our frontend renaissance era. But we’ve definitely hit a buzzword bonanza in recent years… we have client-side rendering, server-side rendering, static site generators, deferred rendering, incremental builds, antidisestablishmentarianism, where does it end?

It’s time to stop the madness. Let’s explore:

  • Where we started with client-side rendering
  • How static builds form the backbone of Jamstack
  • What hybrid approaches like serverless functions and deferred rendering can add
  • Which options you should ultimately choose

Onwards!

First off, what is the Jamstack?

Well, it’s not the kind you spread on toast…

…but it is a pretty tasty architecture!

The “JAM” in Jamstack stands for JavaScript, APIs, and Markup. It’s a pretty broad combination of tech that encompasses all sorts of modern frameworks. According to Jamstack.org, you just need to hit these two core features:

  • Pre-rendering – Anything you serve to the end user should be processed ahead-of-time in some way. This could mean building all your pages in one step like a Jekyll blog, processing pages on-demand using builders, or letting the client fetch everything it needs at runtime. We’ll touch on these approaches and more soon enough!
  • Decoupling – The data storage / processing layer should be separate from your frontend UI. So no more server templates if you’re used to traditional architectures. For agile teams, this is a huge win for local development setups (no need to spin up a database to debug your frontend!) and separating concerns across divisions.

This all sounds great on paper. But as you can imagine, there are a lot of nuances to consider when building your own site. That’s why we’ll be focusing on the most nebulous dimension of the JAMiverse: how and when to render your site.

Client-side Rendering (CSR)

If you rode the wave of frontend frameworks back in 2015, you probably set up a site with some type of client-side rendering. create-react-app comes to mind here. Scaffolding a quick frontend with this tool, you won’t find much in the HTML document:

<html lang="en-US">
  <head>...</head>
  <body>
    <div id="main"><!--app goes here!--></div>
  </body>
</html>

The app is actually one big JS bundle, ready to invade that humble div on page load. This is why we call it “client-side rendering”: the client (the end user’s web browser) is in charge of rendering your app from scratch. You can think of this like going to your kitchen to whip up breakfast. The UI cereal’s in the pantry and the CMS-data-infused milk is in the fridge, but you (the client) need to assemble that bowl in the morning.
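To make that concrete, here’s a minimal sketch of what that bundle’s entry point might look like, assuming React 17 and a made-up /api/posts endpoint (neither comes from create-react-app itself):

// index.tsx: the entire app ships as JS and mounts into that empty div
import React, { useEffect, useState } from "react";
import ReactDOM from "react-dom";

function App() {
  const [posts, setPosts] = useState<string[]>([]);

  // All data fetching happens in the browser, after the page has loaded
  useEffect(() => {
    fetch("/api/posts") // hypothetical endpoint, just for illustration
      .then((res) => res.json())
      .then(setPosts);
  }, []);

  return (
    <ul>
      {posts.map((post) => (
        <li key={post}>{post}</li>
      ))}
    </ul>
  );
}

// The client builds the whole UI inside <div id="main"> at runtime
ReactDOM.render(<App />, document.getElementById("main")!);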

Pros and cons

This approach comes with some serious pain points:

  • You need to fetch all your data client-side. This could be a security risk if you’re, say, pulling data from an external API like Twitter. In order for the client to connect to those data sources, you might ship your API key for anyone to find and abuse.
  • The user has to wait for everything to load. This hurts the experience on low-powered devices and slow internet connections, since the user’s browser is doing all the heavy lifting.

CSR doesn’t have many benefits other than, well, being the simplest to set up from scratch. And for ultra-dynamic experiences like dashboards or maps, CSR may be good enough to render a unique experience for every user. But there’s a reason create-react-app isn’t usually the tool of choice for building websites these days. Which leads us to…

Static rendering

This is where the Jamstack’s “pre-rendering” pillar comes into play. We’ll introduce this concept with an analogy: say you’re working at a McDonald’s and you’re about to hit the lunch rush. How do you prepare that kitchen to pump out burgers as fast as possible?

Well, you could get your ingredients and build each burger from scratch when orders come through. This is how a traditional, server-based architecture might do things:

  • Hit the endpoint
  • Query some data from a database
  • Spit out a finished hamburger HTML page

Sure, we could build each burger individually like this. But we’re McDonald’s! We know that the Big Mac and large fries sell like hotcakes, and we could easily make some batches ahead-of-time to satisfy those orders.

This is how static rendering works. Instead of cooking / building orders on-request, you build everything up-front so it’s instantly ready for the end user. Sure, this means you’ll need to rebuild the whole site if you tweak your recipe, but it saves the user a lot of waiting in the long run!

Aside: Okay, server-based architectures don’t need to process every request from scratch. You’ll often use caching to hold onto responses for a few minutes at a time, ready to serve to the next set of users. The Remix team has a great overview on HTTP cache headers over here for extended reading!
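For instance, here’s how a traditional server might bolt caching onto that burger endpoint. This is a rough sketch using Express, which is just an example stack on my part, not something the Jamstack setups below rely on:

// server.ts: the kitchen cooks the page per request, but browsers and
// CDNs are allowed to reuse the response for five minutes
import express from "express";

const app = express();

app.get("/burger", (_req, res) => {
  // "public, max-age=300" tells any cache to hold onto this response
  // for 300 seconds before asking the kitchen for a fresh one
  res.set("Cache-Control", "public, max-age=300");
  res.send("<h1>One fresh burger, coming up!</h1>");
});

app.listen(3000);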

From Jekyll to Next, Nuxt, SvelteKit and more

This approach is arguably where the Jamstack all began. Jekyll led the charge as a simple, no-frills static site generator that could process templates like Markdown and Liquid, combine them with a data source (ex. a headless CMS), and turn them into user-facing HTML documents. In the end, your “app” doesn’t need a server or even JavaScript to work! As long as you have hosting to dump your directory of HTML pages and miscellaneous assets, you have a site that’s ready-to-browse!

I’d still recommend this SSG (and similar spin-offs like 11ty and Hugo) for any site that doesn’t need much interactive JS. But these days, many developers have grown used to component-based frameworks like React, Vue, and Svelte for adding dynamic elements. So to get the benefits of building everything up-front while also allowing components to do crazy interactions and animations in the browser (CSR-style), you’ll need hydration. 

I’ve written on this subject in the past, but here’s the quick play-by-play:

  1. Your app exists as a big JS bundle of components, much like a create-react-app.
  2. The static site generator renders this bundle to a set of HTML documents at build time. Now, instead of shipping that empty <div id="main"> to the user, you can send the entire page’s markup for a speedy page load.
  3. Now, your browser looks at the HTML page that’s already been generated and compares it against your tree of components. When it finds a component that does something JavaScript-y, it’ll slot that JS into the existing HTML without re-rendering the entire page. In other words, you hydrate your markup with components.

This is what meta-frameworks like Gatsby, Next, Gridsome, and SvelteKit can do for you automatically. If you’re rolling your own solution though, you’ll need to research how you can statically render and hydrate your app to get the benefits of static generation.
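If you do roll your own, the core idea looks something like this rough React sketch. It’s two tiny files, and the names and structure are just for illustration:

// build.tsx: at build time, render the component tree to plain markup
import React from "react";
import ReactDOMServer from "react-dom/server";
import { App } from "./App";

const markup = ReactDOMServer.renderToString(<App />);
// ...the static site generator writes `markup` into <div id="main">
// of the HTML document it spits out

// client.tsx: in the browser, attach event listeners to that existing
// markup instead of re-rendering the whole page from scratch
import ReactDOM from "react-dom";

ReactDOM.hydrate(<App />, document.getElementById("main")!);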

Pros and cons

This approach is much nicer than CSR, and even trumps traditional servers in a few ways:

  • Static sites are the cheapest to host. If you built your site with a traditional server, you need somewhere to host that code you wrote and keep it running 24/7. But with static sites, you just need somewhere to host that build output of HTML documents and miscellaneous assets. Anything from Netlify to an S3 bucket can handle that!
  • Static sites have the shortest load times. Again, since you’re just serving those ready-made burgers, the hungry customer can get their meal in a flash.

There’s a major con worth considering though: building and processing your whole site up-front can take a looooong time. Sure, modern SSGs offer “incremental” builds to cut down on those times (ex. if nothing changed on my /about page between the last build and this one, don’t re-process that page). But if you make one tiny tweak to the footer used across the site, get ready to process hundreds, thousands, even millions of pages at once. If only there were a way to build our most critical pages up-front, and wait to build the others until the user requests them…

Splitting up the static build

There are a few ways to tackle the build-time problem. Ultimately, your choice comes down to the frameworks and hosting solutions at your disposal, but let’s see the buffet of options available as of 2021.

Option 1: Combination of static and server-driven pages

This is the first option I’ve seen in the world of JS-driven Jamstack technologies. The concept is simple: say you have some pages that serve a similar experience to all users, and others that serve different content on every request (aka some are happy with the Big Mac, and others want a made-to-order burger). We can split off with:

  • Static pages getting processed at build time, served by a static HTML document
  • Dynamic pages getting processed by-request, served by either a standalone server or a serverless function

Frameworks including Next, Nuxt, and SvelteKit make this split virtually seamless. Just tag which components should get turned into static pages and which should be served serverless-ly, and the bundler figures out the rest! This isn’t too difficult to solve manually either, as long as you set up your server and HTML documents under the same domain. Single page apps can throw a monkey wrench in that plan though (which is beyond the scope of this article).
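To make the split concrete, here’s a rough Next.js sketch. The file names and the fetchOrdersForUser() helper are placeholders, not anything from a real codebase:

// pages/about.tsx: built once at deploy time, served as a static HTML document
export default function About({ tagline }: { tagline: string }) {
  return <h1>{tagline}</h1>;
}

export async function getStaticProps() {
  return { props: { tagline: "We sell burgers, fast." } };
}

// pages/orders.tsx: processed on every request by a server or serverless function
export default function Orders({ orders }: { orders: string[] }) {
  return (
    <ul>
      {orders.map((order) => (
        <li key={order}>{order}</li>
      ))}
    </ul>
  );
}

export async function getServerSideProps() {
  const orders = await fetchOrdersForUser(); // hypothetical per-user data fetch
  return { props: { orders } };
}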

There’s also the caveat of hosting costs with this approach. Since some pages are now processed by-request, you need to host your server or serverless functions separately to field all those requests. AWS Lambda is the natural choice for this sort of thing, but costs will scale with the number of visitors hitting those dynamic pages.

Option 2: Incremental Static Regeneration (ISR)

This is pretty similar to our “regular” SSG setup, but with a twist! Here’s the step-by-step:

  1. On deploy, we build just the most critically important pages of the site. These are the Big Macs and large fries of your domain.
  2. When the user requests anything outside that critical group, we serve them the version of that page from some previous deploy. Think of this like ordering an unpopular grilled chicken sandwich; we’ll give you the sandwich from the last batch that’s still good enough, but we’ll start preparing a new batch in the background to give the next customer something fresher.
  3. When another user comes along and requests the same page from step 2, we serve them a fresh page that we built on-the-fly. Remember, that last customer made us spring into action and whip up a fresh batch of chicken sandwiches, so we are ready for this customer (and all future customers) as they come along.

That second step is the only pain point, since that user gets stale content that doesn’t match our latest deploy (a pattern known as “stale-while-revalidate”). We’re really using these visitors as “triggers” to decide when to rebuild lesser-used parts of the site. This way, the majority of traffic gets the latest content just as fast as your typical up-front build.

NextJS is leading the charge on built-in ISR right now, but you’ll be limited in where you can host as of 2021.
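For reference, here’s roughly what ISR looks like in NextJS today. The getMenuItem() helper is a hypothetical CMS fetch, just for illustration:

// pages/menu/[slug].tsx: pre-build the hits, defer and revalidate the rest
export default function MenuItem({ name }: { name: string }) {
  return <h1>{name}</h1>;
}

export async function getStaticPaths() {
  return {
    // Only the Big Macs and large fries get built at deploy time
    paths: [{ params: { slug: "big-mac" } }, { params: { slug: "large-fries" } }],
    // Everything else gets built on its first request
    fallback: true,
  };
}

export async function getStaticProps({ params }: { params: { slug: string } }) {
  const name = await getMenuItem(params.slug); // hypothetical CMS fetch
  return {
    props: { name },
    // Serve the cached page, and rebuild it in the background
    // at most once every 60 seconds (stale-while-revalidate)
    revalidate: 60,
  };
}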

Option 3: Distributed persistent rendering (DPR)

This is identical to ISR with one key difference: we never serve stale chicken sandwiches.

DPR assumes users don’t want that stale content, and would rather wait for the server to finish building the page before they see it. So you can keep the same mental model we established in the ISR example. Just know that, at step 2, the customer would wait for a fresh batch to get made before receiving their order.

Netlify coined this term just recently in conjunction with their “on-demand builders” product launch. In short, they’ll help you set up builder functions to serve these “deferred” chicken sandwiches on-request, and cache the result for all future users until you trigger a rebuild. 
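Here’s a small sketch of what one of those builder functions might look like, assuming the @netlify/functions package. The renderSandwichPage() helper is made up for illustration:

// netlify/functions/sandwich.ts: built on first request, then cached
// for every future visitor until the next deploy clears it
import { builder, Handler } from "@netlify/functions";

const sandwichHandler: Handler = async (event) => {
  const html = await renderSandwichPage(event.path); // hypothetical renderer
  return {
    statusCode: 200,
    headers: { "Content-Type": "text/html" },
    body: html,
  };
};

// Wrapping the handler in builder() is what turns it into an
// on-demand builder instead of a regular serverless function
export const handler = builder(sandwichHandler);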

Still, make no mistake! This concept of “building when you need it” has been around forever. Chris Coyier points out how WordPress has pulled off something similar for a while, though not as sexily as a NextJS or SvelteKit might!

We’ve talked about the pros and cons for each. Now help me choose!

Alright, let’s break down our findings:

  • If you’re on one of these fancy hosting solutions like Netlify or Vercel, give static site generation + deferred rendering a shot! NextJS’s Incremental Static Regeneration is great for kicking off deferred content to later builds, and should work across either of these hosting solutions. 11ty also has some promising demos worth trying if you’re building your site without JS-heavy frameworks.
  • If you’re in the realm of raw AWS deploys, Cloudflare, Azure, etc., you’re probably looking for static rendering + incremental builds. I’d suggest a framework to help with static rendering like Gatsby and Next (React), Gridsome (Vue), or Jekyll, 11ty, and Hugo (static templates) to keep build times and developer overhead to a minimum. Solutions like Next and SvelteKit also offer an escape hatch to move pages of your site out of the static build and into a server-like setup using serverless functions. So if you have some pages that should vary by-request and won’t benefit from up-front builds, this is a great one-size-fits-all approach.
  • And if your app just isn’t meant for those static builds (and deferred rendering is out of the question), you might wanna skip the Jamstack and use a traditional server. Hosting solutions are still great for this sort of setup (though often more expensive), and efficient caching will give you the snappy performance users demand.

But whichever you choose, never pick up a framework or hosting solution for technology’s sake. Let the end user’s needs guide you to a best fit!
