
In Magento 2, the robots.txt file plays a critical role in how search engines crawl and index your store. By customizing it, you can guide Google and other crawlers toward your most valuable pages while keeping sensitive or low-value sections hidden.
This guide explains what the robots.txt file is, how Magento 2 handles it, and how you can configure it step-by-step. You’ll also find best-practice examples to ensure your store’s SEO is optimized.
What Is the Magento 2 Robots.txt File?
The Magento 2 robots.txt file is essentially a set of instructions for search engines like Google, telling them which pages on your website to look at and which to ignore. This is crucial for SEO because it helps keep private or unimportant pages out of search results, while guiding search engines to your important product pages. Instead of editing a file directly, you manage these rules from the Magento Admin panel, and the system automatically creates the robots.txt file for you.
How Magento 2 Handles Robots.txt
Instead of using a physical file, Magento 2 generates its robots.txt content automatically based on settings you configure in the Admin panel. When a search engine requests the file, Magento combines its default settings with any custom rules you’ve added. While you can manually place a robots.txt file in your pub/ folder to override this system, it is generally better to use the built-in Admin settings.
Default Behavior: Magento 2's default robots.txt is open, allowing search engines to crawl the entire site, and Magento keeps generating that content automatically unless you override it. That makes it important to customize these settings and add your own rules to keep crawlers out of sensitive or low-value areas such as the admin panel, customer accounts, the checkout process, and internal search results.
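For reference, an "open" robots.txt is equivalent to the minimal rule set below (an illustration of what "allow everything" means, not necessarily Magento's literal output); an empty Disallow directive tells crawlers nothing is off-limits:
User-agent: *
Disallow: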
If you’re unsure which rules to add or want a custom setup tailored to your store — our Magento SEO experts can help.
📩 Contact us today to get personalized assistance.
How to Configure and Edit Magento 2 Robots.txt
Magento 2 makes it easy to edit robots.txt through the Admin Panel:
How to Configure Magento 2 Robots.txt:
- Go to Content > Design > Configuration.
- Select your website or store view and click Edit.
- Then expand Search Engine Robots.
- In the Default Robots dropdown, choose the appropriate option:
- INDEX, FOLLOW: Allow pages to be indexed and links followed
- NOINDEX, FOLLOW: Hide pages from search results but allow crawling links
- INDEX, NOFOLLOW: Show pages in search results but prevent crawling links
- NOINDEX, NOFOLLOW: Block both indexing and crawling
- In the Edit Custom Instruction of Robots.txt File field, add any custom robots.txt rules you need (see the example after these steps).
- Use the Reset to Default button if you want to restore Magento’s default robots.txt instructions.
- Click Save Configuration.
- Flush the Magento cache to apply your changes.
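For example, you might paste rules like these into the Edit Custom Instruction of Robots.txt File field. This is only a minimal sketch using paths from the full template later in this guide; replace the domain and adjust the paths to match your store:
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Sitemap: https://your-domain.com/sitemap.xml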

2 Methods to Add a Sitemap to Magento Robots.txt
Including your XML sitemap in robots.txt helps search engines discover and crawl your content more efficiently. Magento supports both manual and automatic methods.
Automatic Method
How to Enable Automatic Sitemap Submission:
- In the Admin panel, go to Stores → Configuration → Catalog → XML Sitemap.
- Expand the Search Engine Submission Settings section.
- Set Enable Submission to Robots.txt to Yes.
- Click Save Configuration.

Magento will automatically append the sitemap URL configured in your XML Sitemap settings to the robots.txt file. If you have multiple sitemaps or want a custom URL format, you can use the manual method via the Design Configuration section.
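With the setting enabled, the sitemap reference Magento appends to the generated robots.txt typically looks like the standard Sitemap directive below (assuming the default sitemap.xml filename and the URL configured in your XML Sitemap settings):
Sitemap: https://your-domain.com/sitemap.xml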
Manual Method
How to Add Your Sitemap to Robots.txt Manually:
- Go to Content → Design → Configuration → Edit → Search Engine Robots.
- In Edit Custom Instruction of Robots.txt File, add:
Sitemap: https://your-domain.com/sitemap.xml
- Click Save Configuration and flush the cache.
- Verify that the sitemap entry appears in your store's robots.txt.
Tip: If you have multiple sitemaps (e.g., for products, categories, or CMS pages), you can list each one on its own line.
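For instance, a store with separate sitemaps might list them like this (the filenames below are only placeholders; use the sitemap paths your store actually generates):
Sitemap: https://your-domain.com/sitemap_products.xml
Sitemap: https://your-domain.com/sitemap_categories.xml
Sitemap: https://your-domain.com/sitemap_cms.xml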
Practical Robots.txt Instructions for a Magento 2 Store
Every Magento 2 store has different needs, but there are common rules that almost every site should include. A robots.txt file usually contains:
- Allowed resources – CSS, JavaScript, and media files that crawlers need for proper rendering.
- Disallowed duplicate URLs – filter parameters, internal search results, and session IDs that create crawl traps.
- Restricted areas – checkout, customer accounts, and admin paths that should never appear in search engines.
- Sensitive files – system files like cron.php or index.php that must be blocked for security.
- Sitemap references – to guide search engines toward your important pages.
Below is a sample Magento 2 robots.txt template that combines these essentials:
# Basic Magento 2 robots.txt
User-agent: *
# Allow essential resources
Allow: /media/
Allow: /static/
Allow: /*.css$
Allow: /*.js$
# Block duplicate URLs & search results
Disallow: /*SID=
Disallow: /*?dir=
Disallow: /*?mode=
Disallow: /*?limit=
Disallow: /*?order=
Disallow: /*?q=
Disallow: /catalogsearch/
# Block customer & checkout paths
Disallow: /checkout/
Disallow: /customer/
Disallow: /cart/
Disallow: /sales/guest/
# Block system folders & sensitive files
Disallow: /app/
Disallow: /bin/
Disallow: /var/
Disallow: /downloader/
Disallow: /index.php
Disallow: /*.php$
# Sitemap
Sitemap: https://www.example.com/sitemap.xml
How to Use Robots.txt in Magento 2 for Better SEO
A robots.txt file isn’t just about blocking pages — it’s about guiding search engines strategically. Here’s why each rule matters and how to monitor its impact:
1. Block Low-Value or Duplicate Content: Filters, internal search results, and session IDs waste crawl budget and may cause duplicate content. Blocking them ensures Google spends time on your valuable product and category pages.
2. Allow Priority Pages: Your homepage, categories, and products should remain crawlable. The default INDEX, FOLLOW setting makes sure these are indexed.
3. Manage URL Parameters: Unnecessary parameters (?dir=, ?mode=, ?order=) create endless variations of the same content, so block them to avoid crawl traps. Some SEOs block all parameters with Disallow: /*? and then whitelist pagination (?p=); adapt this approach to your store (see the example after this list).
4. Test & Monitor Regularly:
- Use Google Search Console's robots.txt report (the successor to the Robots.txt Tester) to verify your file.
- Check Coverage reports for blocked pages that should be indexed.
- Review Crawl stats to ensure bots focus on your most important pages.
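As an illustration of the parameter-blocking approach mentioned in point 3, the rules below block every query string and then re-allow pagination. Treat this as a sketch to test against your own URL structure rather than a drop-in rule set; Google resolves Allow vs. Disallow conflicts by the most specific (longest) matching rule, so the ?p= allowance takes precedence:
User-agent: *
Allow: /*?p=
Disallow: /*?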
Summary
Configuring Magento 2's robots.txt is a crucial SEO and site-governance task. It's done in the Admin panel, not by editing files. Magento generates the robots.txt file from your settings, so use the Default Robots option plus custom Disallow/Allow rules to control crawler access.
As best practices, block low-value pages (search, checkout, etc.), include your XML sitemap, and never list truly sensitive URLs like your admin path. Whether launching a new store or improving an old one, a well-crafted robots.txt will help search engines index the right pages and protect site performance and security.
Get in touch with us and ensure your Magento 2 robots.txt is fully optimized for search engines.
Frequently Asked Questions (FAQ)
What is the default robots.txt in Magento 2?
The Magento 2 default robots.txt file is open (INDEX, FOLLOW), meaning search engines can crawl the entire site. It’s strongly recommended to add custom rules to block checkout, customer pages, and other sensitive or duplicate content.
How do I add my sitemap to robots.txt in Magento 2?
You can manually add your sitemap like this:
Sitemap: https://yourdomain.com/sitemap.xml
Or, in Magento Admin, enable automatic sitemap submission under:
Stores > Configuration > Catalog > XML Sitemap > Enable Submission to Robots.txt
Should I block layered navigation or filter pages?
Yes, in most cases. Layered navigation parameters (e.g., ?color=, ?price=) create duplicate content and waste crawl budget. Disallow these parameters in robots.txt, while keeping core product and category pages crawlable.
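For example, to block the layered-navigation parameters mentioned above, you could add rules such as these (color and price are only sample filter codes; list the attribute codes your store actually uses). Note that patterns like /*?color= match the parameter when it appears first in the query string, consistent with the template earlier in this guide:
User-agent: *
Disallow: /*?color=
Disallow: /*?price=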