After building the main components of my blog, the first thing on my mind was an XML sitemap to maximise the chances of my posts being found.
As my blog was built with Next.js using SSG (Static Site Generation), I immediately set about finding the correct way to handle XML sitemap generation with SSG.
My initial approach, based on a few blog posts I had read, was to write an XML sitemap by executing a function from within my getStaticProps method, like so:
```typescript
import { GetStaticProps } from 'next';

export const getStaticProps: GetStaticProps = async () => {
  const postData = (await fetchPosts()) as PostsData;

  // Write the sitemap to disk as a side effect of generating the page
  generateSitemap(postData.posts);

  return {
    props: { Posts: postData.posts },
    revalidate: 3600,
  };
};
```
The generateSitemap function would use the posts data to build an XML url entry for each post. I would then stringify this collection and finally write the XML file to the public folder using Node's fs writeFileSync method.
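For context, a minimal sketch of what such a generateSitemap function could look like; the post shape (Slug, SortDate) and the DOMAIN environment variable are assumptions carried over from the SSR snippet later in this post:

```typescript
import { writeFileSync } from 'fs';

// Illustrative post shape; the fields match those used in the SSR snippet below
interface Post {
  Slug: string;
  SortDate: string;
}

const generateSitemap = (posts: Post[]): void => {
  // Build one <url> entry per post
  const urls = posts
    .map(
      (post) =>
        `<url><loc>${process.env.DOMAIN}/post/${post.Slug}</loc><lastmod>${post.SortDate}</lastmod></url>`
    )
    .join('');

  const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">${urls}</urlset>`;

  // This write is the part that later fails on Vercel
  writeFileSync('public/sitemap.xml', sitemap);
};
```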
This all seemed fine at first: I ran the project in dev and checked my sitemap file, and everything was written correctly. I got the same result after running the npm export command and checking the sitemap file generated in the Next.js out folder.
So I was happy all was well and I deployed.
After creating a new blog post I waited some time and found that neither the sitemap nor the posts were updated, so I checked the Vercel logs and found this error:
```
ERROR Error: ENOENT: no such file or directory, open 'public/sitemap.xml'
```
After some searching and not really finding any clear answers, I contacted Vercel support, and the very helpful support engineer told me: 'Since vercel deployments are immutable, you are not able to write to the filesystem after the build'.
So what I was trying to do, executing a function on revalidation that wrote an XML file to the filesystem, was not possible.
I needed a different solution.
The simplest solution I came across was to create an XML sitemap using SSR (Server Side Rendering).
SSR would enable me to provide the sitemap on each request, while all my posts would still be statically generated and I would keep all the benefits of SSG.
Revalidation would still occur for new blog posts, meaning I did not need to trigger a build for each new post, but now my sitemap (only really requested by search engines) would always provide the latest information.
The basic implementation:
```typescript
import { GetServerSideProps } from 'next';
import React from 'react';

// The component renders nothing; the XML is written directly to the response
const Sitemap: React.FC = () => null;

// Build one <url> entry per post slug
const getPosts = (postSlugs: PostSlug[]): string =>
  postSlugs
    .map(
      (item) =>
        `<url><loc>${process.env.DOMAIN}/post/${item.Slug}</loc><lastmod>${item.SortDate}</lastmod></url>`
    )
    .join('');

export const getServerSideProps: GetServerSideProps = async ({ res }) => {
  const postsSlugData = (await fetchPostSlugs()) as PostsSlugData;

  if (res) {
    res.setHeader('Content-Type', 'text/xml');
    res.write(`<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>${process.env.DOMAIN}</loc>
    <lastmod>${postsSlugData.posts[0].SortDate}</lastmod>
  </url>
  ${getPosts(postsSlugData.posts)}
</urlset>`);
    res.end();
  }

  return { props: {} };
};

export default Sitemap;
```
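Assuming the page is saved as pages/sitemap.xml.tsx (Next.js allows dots in page filenames, so the route resolves to /sitemap.xml), a small script like the sketch below can sanity-check the response; the localhost URL is just a placeholder for a local dev server:

```typescript
// Hypothetical sanity check: fetch the sitemap route and inspect the response.
// Assumes the dev server is running on localhost:3000 (adjust to your setup).
const checkSitemap = async (): Promise<void> => {
  const res = await fetch('http://localhost:3000/sitemap.xml');

  console.log(res.status); // expect 200
  console.log(res.headers.get('content-type')); // expect "text/xml"
  console.log((await res.text()).slice(0, 300)); // start of the XML document
};

checkSitemap();
```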
My full implementation can be found here on my GitHub.
I would love to know if there is a better way, so please leave a comment or contact me @DevSabbatical.
For the moment, though, Google seems happy after I added the sitemap URL to Google Search Console, and my pages are being indexed.