[Vuejs]-Why are my Nuxt/Vue pages getting blocked by robots.txt?

By default Nuxt serves each route both with and without a trailing slash, for example /pricing and /pricing/.

Crawlers can detect this as duplicate content, so you should declare which one is the main URL with a rel="canonical" link.
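
A minimal sketch of setting such a canonical link from a page's head() option (the page name is hypothetical; the hostname is taken from the sitemap config below):

// pages/about.vue

export default {
  head() {
    return {
      link: [
        // point crawlers at a single main URL for this page
        { rel: 'canonical', href: 'https://northarc.dk' + this.$route.path }
      ]
    }
  }
}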

But if you want to keep only the URLs with a trailing slash, you have to allow only routes with a trailing slash via the router configuration:

// nuxt.config.js

export default {
  router: {
    // only register routes with a trailing slash at the end
    trailingSlash: true
  }
}

See the docs: https://nuxtjs.org/docs/2.x/configuration-glossary/configuration-router#trailingslash


In addition, you don't need to hardcode all your routes in the sitemap-module config; static routes are picked up automatically, e.g.:

// nuxt.config.js

export default {
  sitemap: {
    hostname: 'https://northarc.dk',
    defaults: {
      changefreq: 'monthly',
      priority: 1,
      trailingSlash: true
    },
    // exclude patterns match full route paths
    exclude: ['/roi', '/pricing'],
    trailingSlash: true // if necessary
  }
}
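
Only dynamic routes (e.g. /blog/:slug) are not picked up automatically; a minimal sketch of adding them through the module's routes option, with hypothetical blog slugs:

// nuxt.config.js

export default {
  sitemap: {
    hostname: 'https://northarc.dk',
    trailingSlash: true,
    // `routes` accepts an array or an (async) function returning extra paths
    routes: async () => {
      // hypothetical slugs; fetch them from your API or CMS instead
      const slugs = ['first-post', 'second-post']
      return slugs.map(slug => `/blog/${slug}/`)
    }
  }
}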
