If you have a single-page application and want to fix its SEO problems quickly, you've come to the right place! This post is a complete guide to the prerendering solution.
Client-side rendering (CSR) offers an excellent developer experience when you only have to deal with frontend technologies and consume separate APIs hosted elsewhere. But it has a well-known critical weakness: it performs poorly with search engines and social sharing. Sites with dynamic, SEO-sensitive content need to fix this problem as soon as possible.
The problem can be solved completely by converting to server-side rendering (SSR), but that is an expensive solution: you have to change a significant part of your frontend codebase, replace client-side-only third-party packages, and it can take ages to implement SSR properly.
Universal rendering (via rehydration) is quite similar to SSR: it can also solve the problem completely, but it is too heavy a change for an existing app.
You might think of static rendering, but it is not an option for dynamic content.
Prerendering to the rescue! Prerendering is a technique in which you use a renderer to pre-render your CSR app and respond to page requests with ready-to-view HTML output. The prerendering process can happen at build time, at runtime, or on a schedule, using local tools or remote services.
The idea behind prerendering is simple: you make your CSR app behave like a statically rendered app by using a renderer (often a headless browser) to render requested pages ahead of time and respond with static HTML.
Prerendering at build time: similar to static site generators, it does not work with dynamic content, but it can be used to improve the first contentful paint of your CSR app.
Prerendering on the fly: a service intercepts requests, outputs ready-to-render HTML, caches the HTML, and finally responds to the client. It works well with dynamic content and can detect crawler user agents and serve them differently.
Prerendering on schedule: you schedule the renderer to pre-render all pages at your preferred interval; highly dynamic content needs a small interval to stay fresh. It performs similarly to on-the-fly prerendering, but it is very memory-consuming unless you use a less aggressive caching strategy.
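The on-the-fly flow above can be sketched in a few lines. This is a minimal illustration, not a production implementation: `render` is a hypothetical stub standing in for a real headless-browser renderer, and the cache is a plain in-memory `Map` with no expiry.

```javascript
// Sketch of on-the-fly prerendering with a simple in-memory cache.
const cache = new Map();

// Hypothetical stand-in for a real renderer: in practice this would
// load the SPA in a headless browser and return the rendered HTML.
async function render(url) {
  return `<html><body>Rendered: ${url}</body></html>`;
}

async function prerender(url) {
  if (cache.has(url)) {
    return cache.get(url); // serve the cached snapshot
  }
  const html = await render(url); // render on the first request
  cache.set(url, html);           // cache for subsequent requests
  return html;
}
```

A real setup would also expire cache entries, which is exactly the trade-off mentioned above: a short TTL keeps content fresh but re-renders often, a long TTL saves work but serves stale pages.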
You can adopt prerendering fully (for all routes and all user agents), dynamically per user agent (requests from crawlers are routed to a renderer while requests from users are served normally), or dynamically on a per-page basis.
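Per-user-agent adoption starts with a user-agent check. Here is a minimal sketch; the bot list is illustrative, not exhaustive, and real middleware usually also checks for an `_escaped_fragment_` parameter or maintains a longer list.

```javascript
// Decide whether a request should be routed to the prerenderer
// based on its User-Agent header. Illustrative bot list only.
const BOT_PATTERN =
  /googlebot|bingbot|yandex|duckduckbot|twitterbot|facebookexternalhit|linkedinbot|slackbot/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}
```

In middleware, a request where `isCrawler(req.headers['user-agent'])` is true would be proxied to the renderer; everything else is served the normal CSR bundle.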
Setting up prerendering is simple, and it lets you keep your frontend untouched as a static site, without the need for a Node server.
Load time slows down if you use on-demand prerendering, and memory consumption becomes expensive if you cache pre-rendered pages very aggressively.
Prerendering is a quick but incomplete rendering strategy: it is not as fast as static rendering, and it cannot handle dynamic content as well as SSR.
Data that you fetch from an API should not be prerendered, as this data is dynamic and is going to change.
The ugly part of prerendering is that the renderer works just like a crawler: it has trouble knowing when lazy-loaded content has finished loading.
Consider prerendering when you're adding SEO and dynamic content to your application and, given your current architecture, it's not feasible to start server-side rendering certain routes in your app.
When your application architecture has already been put together and a considerable amount of work has been done, and you need to make SEO improvements, a prerendering service is a good choice.
When the content lives behind a login screen, prerendering is unnecessary, since bots will never make it past the login screen.
Any single-page application that uses webpack can be pre-rendered with the prerender-spa-plugin.
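A minimal configuration sketch for build-time prerendering with prerender-spa-plugin might look like the following. The `staticDir` path and the route list are assumptions about your build layout; substitute your own output directory and routes.

```javascript
// webpack.config.js (fragment) — build-time prerendering with
// prerender-spa-plugin. Routes listed here are rendered to static
// HTML files when the build finishes.
const path = require('path');
const PrerenderSPAPlugin = require('prerender-spa-plugin');

module.exports = {
  // ...your existing entry/output/loader configuration...
  plugins: [
    new PrerenderSPAPlugin({
      // Directory webpack emits the built SPA into (assumed here).
      staticDir: path.join(__dirname, 'dist'),
      // Example routes to prerender at build time.
      routes: ['/', '/about', '/contact'],
    }),
  ],
};
```

As noted above, this only helps routes whose content is known at build time; dynamic routes still need an on-the-fly or scheduled approach.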
You can install and configure a dynamic renderer to transform your content into static HTML that is easier for crawlers to consume. Some common dynamic renderers are Puppeteer, Rendertron, and Prerender.
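With Puppeteer, rendering a single URL to static HTML can be sketched as below. This assumes the `puppeteer` package is installed; waiting for `networkidle0` is one common heuristic for "the SPA has finished loading," but as the lazy-loading caveat above suggests, it is not a guarantee.

```javascript
// Sketch: render one URL of a CSR app to static HTML with Puppeteer.
const puppeteer = require('puppeteer');

async function renderPage(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Wait until the network has been idle, a rough signal that the
  // app has fetched its data and finished rendering.
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // serialized, rendered DOM
  await browser.close();
  return html;
}
```

The returned HTML string is what an on-the-fly or scheduled setup would cache and serve to crawlers.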
These services let you install middleware on your server (Apache, Node.js, Nginx, etc.) or configure it in your cloud setup (S3 is another valid option for serving files statically) that checks each request to see whether it comes from a web crawler.
Prerender.io is a dedicated prerendering service that is also available as open source. It scrapes your website on a regular basis using the latest Chrome, stores all the rendered HTML pages in a database, and gives you an API so you can access the rendered HTML for every URL of your website.