
Dynamic metatags in your React app

Juan Polanco
August 1st, 2020 · 2 min read

Single-page applications and progressive web apps are fast becoming the global standard for website and app development: they’re fast, reliable, and extensible, and they have an incredible community of developers behind them.

React and the Create React App project are the preferred framework choice for millions of developers all over the world.

One of the biggest drawbacks of single-page applications is SEO. If you have only one page, you can’t render different metatags for your internal routes. Solutions like react-helmet can change the metatags dynamically while the user navigates your app, but that won’t work for crawlers like Googlebot or the Facebook bot.

So, how do you add dynamic SEO metatags to your create-react-app?

You could solve this issue by rendering static pages for each route you have, or by developing your app from the beginning with a framework that handles this problem, like Next.js.

But what if you don’t want to configure prerendering for your app, and just want the crawlers to read custom metadata depending on the URL?

Express.js to the rescue

Express.js is a minimal back-end framework for Node.js. To install it, just run:

npm install express --save

For this solution you will use Express to serve your React build. Create a server folder in the root of your project and add an index.js file inside it. Then create a simple Express server.

This solution will help you add custom metadata to your React app, but it would work just as well for Vue or Angular apps, or for any other single-page application framework.

const express = require('express');
const app = express();

app.listen(process.env.PORT || 8080);

Now let’s make sure that Express returns your build for all routes:

const express = require('express');
const path = require('path');

const app = express();

// Serve the static assets (JS, CSS, images) from the build folder
app.use(express.static(path.join(__dirname, '../build')));

app.get('*', (req, res) => {
  return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
});

app.listen(process.env.PORT || 8080);

Great, we are now serving our create-react-app build with Express.js.

We still need to add custom metadata for the crawlers. For this, install a crawler-detection tool like es6-crawler-detect:

npm install es6-crawler-detect

Now, on every call to a route, we can detect whether the request comes from a crawler; es6-crawler-detect uses the User-Agent header for that:

const express = require('express');
const path = require('path');
const { Crawler } = require('es6-crawler-detect');

const app = express();

app.get('*', (req, res) => {
  const CrawlerDetector = new Crawler(req);
  const isCrawler = CrawlerDetector.isCrawler(req.get('user-agent'));

  return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
});

app.listen(process.env.PORT || 8080);

[Image: Googlebot reviewing your code]
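Under the hood, es6-crawler-detect works by matching the User-Agent string against a long list of known bot patterns. Here is a tiny illustrative sketch of the idea (this regex is a deliberate simplification, not the library’s actual pattern list):

```javascript
// Simplified illustration only: the real library checks many more patterns
const looksLikeCrawler = (userAgent = '') =>
  /bot|crawler|spider|slurp|facebookexternalhit/i.test(userAgent);

console.log(looksLikeCrawler('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)')); // true
console.log(looksLikeCrawler('Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36')); // false
```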

Now the fun part. If a crawler is detected, we will read the index.html file and make a simple text replacement. For this, we need to add placeholders to the metatags in our index.html:

<head>
  <title>__TITLE__</title>
  <meta name="description" content="__DESCRIPTION__" />
  <meta property="og:title" content="__TITLE__" />
  <meta property="og:description" content="__DESCRIPTION__" />
</head>

You may be wondering: why not just do the replacement every time and avoid the crawler detection? The main reason is that you will usually get the metadata from an external API, and such calls can be slow and hurt the loading speed of your app for regular users.

The built-in fs module is needed for reading the index.html contents:

const express = require('express');
const fs = require('fs');
const path = require('path');
const { Crawler } = require('es6-crawler-detect');

const app = express();

app.get('*', (req, res) => {
  const CrawlerDetector = new Crawler(req);
  const isCrawler = CrawlerDetector.isCrawler(req.get('user-agent'));

  if (isCrawler) {
    const filePath = path.resolve(__dirname, '../build', 'index.html');
    // Read the index.html content
    fs.readFile(filePath, 'utf8', (err, data) => {
      if (err) {
        return res.status(500).send('Error reading index.html');
      }
      // Replace the metatag placeholders
      data = data
        .replace(/__TITLE__/g, 'My new title')
        .replace(/__DESCRIPTION__/g, 'My new description');

      return res.send(data);
    });
  } else {
    return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
  }
});

app.listen(process.env.PORT || 8080);
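The replacement itself is plain string substitution. Here is a standalone sketch of what happens to the HTML (the trimmed-down markup and values are illustrative):

```javascript
// A trimmed-down index.html with the metatag placeholders
const template = `<head>
  <title>__TITLE__</title>
  <meta name="description" content="__DESCRIPTION__" />
</head>`;

// The /g flag makes sure every occurrence of each placeholder is replaced
const rendered = template
  .replace(/__TITLE__/g, 'My new title')
  .replace(/__DESCRIPTION__/g, 'My new description');

console.log(rendered.includes('<title>My new title</title>')); // true
console.log(rendered.includes('__TITLE__')); // false
```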

We’re now replacing the metatags for all routes. Next, we need to add custom data to each route of the project, or at least to the routes that need special SEO treatment. We could do this in many ways:

  1. Create a custom server route for each React route (register these before the catch-all * route):

    app.get('/home', (req, res) => {
      const CrawlerDetector = new Crawler(req);
      const isCrawler = CrawlerDetector.isCrawler(req.get('user-agent'));

      if (isCrawler) {
        const filePath = path.resolve(__dirname, '../build', 'index.html');
        // Read the index.html content
        fs.readFile(filePath, 'utf8', (err, data) => {
          // Replace the metatag placeholders
          data = data
            .replace(/__TITLE__/g, 'home title')
            .replace(/__DESCRIPTION__/g, 'home description');

          return res.send(data);
        });
      } else {
        return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
      }
    });
  2. Create an SEO dictionary and replace the metatags depending on the route:

    const mySEODictionary = {
      home: {
        title: 'Homepage',
        description: 'Homepage description'
      },
      about: {
        title: 'About page',
        description: 'About description'
      }
    };

    app.get('*', (req, res) => {
      const CrawlerDetector = new Crawler(req);
      const isCrawler = CrawlerDetector.isCrawler(req.get('user-agent'));

      if (isCrawler) {
        const filePath = path.resolve(__dirname, '../build', 'index.html');
        // '/home' splits into ['', 'home'], so the route name is at index 1
        const route = req.originalUrl.split('/')[1];
        const seo = mySEODictionary[route] || { title: 'Default title', description: 'Default description' };

        // Read the index.html content
        fs.readFile(filePath, 'utf8', (err, data) => {
          // Replace the metatag placeholders
          data = data
            .replace(/__TITLE__/g, seo.title)
            .replace(/__DESCRIPTION__/g, seo.description);

          return res.send(data);
        });
      } else {
        return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
      }
    });
  3. Most common: get the data from your API server

    Modern web applications are usually connected to an external API. You can get the metadata from the API and replace the React metatags with it:

    // Node versions before 18 don't include a global fetch, so use a package like node-fetch
    const fetch = require('node-fetch');

    app.get('*', (req, res) => {
      const CrawlerDetector = new Crawler(req);
      const isCrawler = CrawlerDetector.isCrawler(req.get('user-agent'));

      if (isCrawler) {
        const filePath = path.resolve(__dirname, '../build', 'index.html');
        const route = req.originalUrl.split('/')[1];

        // Read the index.html content
        fs.readFile(filePath, 'utf8', (err, data) => {
          fetch(`myapi/${ route }`)
            .then(r => r.json())
            // Name this parameter something other than `res`,
            // or it will shadow the Express response object
            .then(body => {
              // Replace the metatag placeholders
              data = data
                .replace(/__TITLE__/g, body.title)
                .replace(/__DESCRIPTION__/g, body.description);

              return res.send(data);
            });
        });
      } else {
        return res.sendFile('index.html', { root: path.join(__dirname, '../build/') });
      }
    });
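Both the dictionary and the API approaches key the lookup off the first URL segment. Since req.originalUrl starts with a slash, split('/') puts that segment at index 1. A small helper like this makes the behavior explicit (the query-string stripping and the 'home' fallback are illustrative choices, not part of the library):

```javascript
// '/about' -> ['', 'about'], so the route name is at index 1;
// strip any query string first and fall back to 'home' for the root URL
const routeOf = (originalUrl) =>
  originalUrl.split('?')[0].split('/')[1] || 'home';

console.log(routeOf('/about'));     // 'about'
console.log(routeOf('/about?x=1')); // 'about'
console.log(routeOf('/'));          // 'home'
```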

That’s it! You can tweak the code to fit your app. There are many ways to solve this problem, but I’ve found this solution to be the fastest and easiest to implement. I really hope it saves you hours of research.


I crafted the Googlebot for this article with polymer clay; you can see the process here.

Thanks for reading! 🙏
