Tuesday, January 2, 2024

SEO Optimization for Client Side Rendered Next.js Apps

Rishi Raj Jain @rishi_raj_jain_

When I was designing and building a Client Side Rendered Angular 7 app at wellowise, the biggest concern was Search Engine Optimization for the business. Why? Because it helps crawler bots such as Google understand the content and index the page, and it enables rich previews when the page link is shared on Social Media (such as WhatsApp, Telegram, etc.). Similarly,

Client Side Rendered Next.js apps lack Search Engine Optimization and Rich Previews

So I set out to solve this with dynamic rewrites on Vercel and Next.js: identify requests from crawler bots and, whenever a bot crawls any page in my app that starts with /posts/, rewrite the URL to an upstream that emits HTML containing server-side content and takes care of creating rich previews. Let's get started.

Configuring a Regex to Identify Crawler Bot Requests

Here's a regex that one can use to identify requests whose User-Agent header belongs to a crawler bot.

// File: next.config.js

const allowedBots = '.*(bot|telegram|baidu|bing|yandex|iframely|whatsapp|facebook).*';
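
As a quick sanity check, you can test the pattern as a plain JavaScript RegExp against a few sample User-Agent strings. Note that the pattern is case-sensitive as written, so the keywords need to appear in lowercase in the User-Agent. This is just a local helper sketch; the file name and sample strings are my own, not part of the app.

// File: check-bots.js (a local sanity check, not part of the app)

const allowedBots = '.*(bot|telegram|baidu|bing|yandex|iframely|whatsapp|facebook).*';

// Sample User-Agent strings: the first three should match, the last one should not
const samples = [
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
  'facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)',
  'Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Safari/605.1.15',
];

for (const ua of samples) {
  console.log(new RegExp(allowedBots).test(ua), ua);
}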

Configuring Next.js Config for Detecting Crawler Bots

Update your app's next.config.* to contain the following rewrite, guarded by a condition:

// File: next.config.js

/** @type {import('next').NextConfig} */
const nextConfig = {
  // ...
  async rewrites() {
    return [
      // ...
      {
        // The destination the bots' requests are rewritten (proxied) to
        destination: 'https://my-og.url/posts/:match*',

        // The condition that applies the rewrite only to crawler bots
        has: [{ key: 'user-agent', type: 'header', value: allowedBots }],

        // The path pattern the condition above is run for
        source: '/posts/:match*'
      }
    ];
  },
}

module.exports = nextConfig;

And we're done! With this setup in place, every request from a crawler bot to your app's /posts/:slug pages is proxied, and the app serves the content from the URL: https://my-og.url/posts/:slug.
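
If you want to verify the behavior, one rough way (a sketch, assuming the app runs locally on port 3000 and has a /posts/hello page; both are assumptions) is to request the same page twice with different User-Agent headers and compare the responses. This needs Node 18+ for the global fetch.

// File: scripts/verify-rewrite.mjs (a rough local check; the port and slug are assumptions)

const url = 'http://localhost:3000/posts/hello';

// Request the page once pretending to be a crawler bot...
const asBot = await fetch(url, {
  headers: { 'user-agent': 'Mozilla/5.0 (compatible; Googlebot/2.1)' },
});

// ...and once as a regular browser
const asBrowser = await fetch(url, {
  headers: { 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Safari/605.1.15' },
});

// The bot response should contain the meta tags proxied from https://my-og.url/posts/hello,
// while the browser response should still be the regular client side rendered shell
console.log('as bot:', (await asBot.text()).includes('og:title'));
console.log('as browser:', (await asBrowser.text()).includes('og:title'));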

Configuring Upstream OG URL

By now, we've configured crawler bot requests to be rewritten to https://my-og.url/posts/:slug. The next step in creating social previews is to configure that upstream OG URL to dynamically send out all the SEO meta tags required for a great social preview when the link is shared, along with informational content for the crawlers to index.
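
What that upstream looks like is up to you: another Next.js app, a serverless function, or any server that can render HTML per slug. Here's a minimal sketch using a plain Node.js HTTP server; the getPost helper is hypothetical and stands in for whatever CMS or database your posts live in.

// File: og-server.mjs (a minimal sketch of what https://my-og.url could serve)

import { createServer } from 'node:http';

// Hypothetical helper: fetch the post's title, description and image from your data source
async function getPost(slug) {
  return {
    title: `Post: ${slug}`,
    description: 'Server-rendered description for crawlers.',
    image: `https://my-og.url/images/${slug}.png`,
  };
}

createServer(async (req, res) => {
  const match = req.url.match(/^\/posts\/(.+)$/);
  if (!match) {
    res.writeHead(404);
    res.end();
    return;
  }
  const post = await getPost(decodeURIComponent(match[1]));
  res.writeHead(200, { 'content-type': 'text/html' });
  // Emit the SEO meta tags for rich previews plus indexable content for the crawlers
  res.end(`<!DOCTYPE html>
<html>
  <head>
    <title>${post.title}</title>
    <meta name="description" content="${post.description}" />
    <meta property="og:title" content="${post.title}" />
    <meta property="og:description" content="${post.description}" />
    <meta property="og:image" content="${post.image}" />
    <meta name="twitter:card" content="summary_large_image" />
  </head>
  <body>
    <h1>${post.title}</h1>
    <p>${post.description}</p>
  </body>
</html>`);
}).listen(3001);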

Example

https://github.com/heyxyz/hey/blob/main/apps/web/next.config.js#L65-L73
