How to Add a Sitemap to a Robots.txt File

Learn how to add a sitemap to your robots.txt file and why it matters. Adding a sitemap to robots.txt helps search engines discover and index your content, improving your website's visibility in search results.


If you are a blogger or web developer, you have probably heard of two files: "robots.txt" and the "XML sitemap". Both play a role in getting your site indexed on Google: search engine bots crawl your pages and add them to the index.

In this article, we will show you how to add sitemaps to your robots.txt file.

Robots.txt

Robots.txt is a text file containing sets of instructions for the search engine bots, or web crawlers, that crawl your website. Web crawlers are responsible for indexing the content of your website or blog so that it can appear in search engine results.

With a robots.txt file, you can specify which parts of your website or blog should or should not be crawled by search engine bots. You can also block specific pages, duplicate pages, or resources from crawlers.

In short, it is mainly used to control the behavior of web crawlers on your website or blog.

The robots.txt file is usually located in the root directory of the website or blog and can be accessed at "https://example.com/robots.txt".

A robots.txt file looks like this:
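For illustration, here is a minimal robots.txt. The paths and domain below are placeholders; your own file will list your site's directives:

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```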

XML Sitemap (sitemap.xml)

An XML sitemap, or sitemap.xml, is a file that describes the structure and organization of your website's content. It lists the URLs on the website or blog, along with additional metadata such as the last-modified date and update frequency.

Sitemap.xml looks like this:
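Here is a minimal example following the sitemaps.org protocol. The URLs and dates are placeholders; `<changefreq>` and `<lastmod>` are optional:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://example.com/blog/first-post</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```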

How to add a sitemap to robots.txt?

You can add the sitemap to robots.txt as follows:

User-agent: *
Disallow: /cdn-cgi/
Disallow: /_next/
Sitemap: https://blogsy.co/sitemap.xml

As the last line above shows, the Sitemap directive specifies the sitemap URL of your website or blog in the robots.txt file.

This directive tells search engines like Google, Bing, and Yandex where to find the website's XML sitemap.
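To sketch how a crawler picks up the directive, Python's standard-library `urllib.robotparser` exposes any Sitemap lines it finds; the snippet below parses the same robots.txt content shown above:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the example above
robots_txt = """\
User-agent: *
Disallow: /cdn-cgi/
Disallow: /_next/
Sitemap: https://blogsy.co/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() returns the declared Sitemap URLs, or None if there are none
print(parser.site_maps())  # -> ['https://blogsy.co/sitemap.xml']
```

Real crawlers work the same way: they fetch robots.txt first, so a sitemap declared there is discovered without any extra submission step.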

Why is adding a sitemap to robots.txt important?

We just saw how quickly and easily a sitemap can be added to the robots.txt file. The alternative is to submit your XML sitemap to each search engine separately through its webmaster tools, which is a time-consuming task.

Adding the sitemap to robots.txt gets the job done in one place, so you don't have to submit it to each webmaster tool. Because robots.txt is the entry point for all crawlers, they will find the sitemap URL there and index the listed pages in search engine results.

When you update your website and sitemap, search engines can quickly identify changes by referring to the sitemap. This can lead to faster updates in search results when you make changes to your content.

How to add a sitemap to Robots.txt using Blogsy?

Navigate to Settings and then Sitemap Setting. There you will find a textarea for robots.txt; as shown in the image below, add the Sitemap directive to robots.txt and click Save Changes. Also, don't forget to deploy the changes by clicking the Deploy Site button in the nav bar.

Blogsy is a super-fast blogging platform that generates SEO-ready blogs effortlessly, with no setup or maintenance required.

How to create an SEO-optimized super fast blog with Blogsy - No coding required

Conclusion

Adding a sitemap to your robots.txt file is a simple but important SEO practice. By including the Sitemap directive in robots.txt, you point search engines and web crawlers to your sitemap.xml file so that all your web pages can be discovered and indexed, improving your website's visibility in search engine results.

Need an SEO-ready, lightning-fast blog?

Blogsy can create an SEO-ready, blazing-fast blog that keeps visitors engaged and satisfied. No coding or server setup required.
