Okay, picture this: you’re running an online store, and you want to make sure your products are easily found on the internet. That’s where the Shopify robots.txt file comes into play. Now, I know it might sound a bit technical, but it’s a big deal when it comes to your site’s SEO (Search Engine Optimization).

E-commerce websites, like the ones you probably manage, tend to be larger and more complex than the average website. They often have features like faceted navigation, which can multiply the number of URLs on the site. So, to make sure Google and other search engines don’t get lost in the labyrinth of your site, you need to control how they crawl it. That control is essential for managing your crawl budget (roughly, how many of your pages search engines are willing to crawl in a given period) and for keeping Google away from low-quality pages.

But here’s the catch: if you’re using Shopify for your online store, you might have run into a little snag. SEO experts and Shopify store owners have been longing for the ability to tweak the robots.txt file on the platform. Unlike some other platforms, like Magento, where editing the robots.txt is a breeze, Shopify used to keep this locked down.

The default robots.txt that Shopify provides is pretty good at keeping unwanted web crawlers away, but some sites need more specific rules to fit their unique needs. As more and more e-commerce sites adopt Shopify, they’re getting bigger and more intricate. That’s when the need for tweaking the robots.txt becomes apparent.

The good news is that Shopify has been listening to the SEO community. In June 2021, they made a grand announcement: you can now customize your site’s robots.txt file. This is fantastic news for SEO experts and Shopify store owners who’ve been crossing their fingers for this feature.


Now that we know you can edit the file, let’s dive into how to do it and when you might consider making some changes.

What’s the Shopify Robots.txt?

Let’s break it down a bit. The Shopify robots.txt is a file that tells search engines which URLs they’re allowed to crawl on your site. Its main job is to keep web crawlers away from low-quality pages that you don’t want showing up in search results. Shopify generates this file from a Liquid template called robots.txt.liquid.

What’s in the Default Shopify Robots.txt?

When you start a new Shopify site, you’ll notice that a robots.txt file is already in place. You can find it by going to:

```
domain.com/robots.txt
```

This default robots.txt file has several pre-configured rules, and most of them are quite handy for keeping search engines from crawling unnecessary pages. Here are some of the essential rules in the default Shopify robots.txt:

– Disallow: /search – This blocks internal site search.

– Disallow: /cart – Stops the Shopping Cart page from getting crawled.

– Disallow: /checkout – Keeps the Checkout page off-limits.

– Disallow: /account – Blocks the account page.

– Disallow: /collections/*+* – Prevents duplicate category pages generated by faceted navigation.

– Sitemap: [Sitemap Links] – Points to the store’s sitemap.xml file.
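
Put together, an abbreviated version of the generated file looks something like the sketch below (the exact rules, the additional user-agent groups, and the sitemap URL vary by store, so don’t treat this as a verbatim copy):

```
User-agent: *
Disallow: /search
Disallow: /cart
Disallow: /checkout
Disallow: /account
Disallow: /collections/*+*
Sitemap: https://domain.com/sitemap.xml
```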

By and large, Shopify’s default rules do a good job of keeping low-quality web pages away from most sites. In fact, most Shopify store owners probably won’t need to tinker with their robots.txt file at all. The default setup should work for most cases since many Shopify sites are relatively small, and controlling the crawl isn’t a massive issue.

But, as more websites join the Shopify club and these sites grow in size and complexity, default rules might not cover all the bases. In these cases, you might want to create additional rules to tailor the robots.txt to your site. You can do this by editing the robots.txt.liquid file.

How to Create the Shopify Robots.txt.liquid File?

Creating your custom Shopify robots.txt.liquid file is a breeze. Just follow these steps in your Shopify store:

1. Go to your Shopify admin page.

2. On the left sidebar, click on Online Store > Themes.

3. On your current theme, open the “…” (or “Actions”) menu and choose “Edit code.”

4. Under “Templates,” click the “Add a new template” link.

5. From the dropdown menu, choose “robots.txt.”

6. Create the template.

This will open your Shopify robots.txt.liquid file in the editor. You’re all set to customize it.
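
At this point, the editor shows a template that should look close to the sketch below. This is roughly Shopify’s stock robots.txt.liquid (I’m using the object name robots.default_groups from Shopify’s documentation; double-check it against what your store actually generates): it loops over the default rule groups and prints each group’s user agent, its rules, and its sitemap reference.

```
{%- comment -%} Stock template: print Shopify's default robots.txt rules {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Leave that loop in place; the customizations below get added inside it.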

How to Edit the Shopify Robots.txt File?

When you want to add or adjust a rule in the Shopify robots.txt file, you do it by adding Liquid code blocks to robots.txt.liquid. Here’s the basic pattern:

```
{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: [URLPath]' }}
{%- endif -%}
```

Let’s say your Shopify site uses “/search-results/” for internal search, and you want to prevent web crawlers from accessing it. You’d add the following command to the file:

```
{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /search-results/*' }}
{%- endif -%}
```

If you need to block multiple directories, like “/search-results/” and “/private/,” you’d include two blocks like this:

```
{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /search-results/*' }}
{%- endif -%}

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /private/*' }}
{%- endif -%}
```

This will ensure those lines appear in your Shopify robots.txt file.
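
A quick note on placement: these blocks need to live inside the loop that prints Shopify’s default groups, so your rules get appended to the existing wildcard (*) group instead of replacing the defaults. Here’s a sketch of the full template with both custom blocks in place (same caveat as before: robots.default_groups is the object name from Shopify’s docs, and the paths are just examples):

```
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- comment -%} Shopify's default rules for this group {%- endcomment -%}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Custom rules, added only to the wildcard (*) group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /search-results/*' }}
  {%- endif -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /private/*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```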

When Should You Edit Your Shopify Robots.txt File?

So, you’ve got this shiny new power to edit your robots.txt.liquid file, but when should you use it? In most cases, the default Shopify robots.txt should do the trick. However, here are some situations where you might want to roll up your sleeves and make adjustments:

1. Internal Site Search: Blocking your site’s internal search via robots.txt is good SEO practice. Why? When users type all sorts of queries into a search bar, an effectively infinite number of URLs can be generated. If Google starts crawling all of those pages, you might end up with a bunch of low-quality search results in the index. Fortunately, Shopify’s default robots.txt already blocks standard internal search with the “Disallow: /search” rule. However, many Shopify sites use other search technologies that change the URL format, and when that happens, you lose the default protection and need a custom rule (like the “/search-results/” example above).

2. Faceted Navigations: If your site has faceted navigation (those handy filters on category pages that let users narrow down their choices), you might need to tweak your Shopify robots.txt. The default setup blocks many of the pages created by faceted navigation, but it can’t cover every possible URL pattern. If your faceted navigation generates URLs that aren’t blocked by the default rules, it’s time to step in and add custom rules to cut down crawling of low-quality or near-duplicate pages (a sketch of what such rules might look like follows this list).

3. Sorting Navigation: Many e-commerce sites offer sorting options on their category pages (like sorting products by price, relevance, or alphabetically). These sorted pages usually contain duplicate or near-duplicate content, since they’re just variations of the original category page with the products rearranged. If you see parameterized URLs getting crawled, such as a URL ending in “?q=alphabetical” when users sort products alphabetically, that’s a sign you may want to block them to avoid unnecessary crawling.
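
To make the last two cases concrete, suppose your theme’s faceted navigation produces URLs like “/collections/shirts?filter=red” and its sorting option appends “?q=alphabetical” (both URL patterns are hypothetical; pull the real ones from your crawl or log data before blocking anything). Inside the same loop as before, the extra blocks might look like this:

```
{%- comment -%} Hypothetical faceted-navigation parameter {%- endcomment -%}
{%- if group.user_agent.value == '*' -%}
  {{ 'Disallow: /*?filter=*' }}
{%- endif -%}

{%- comment -%} Hypothetical sorting parameter {%- endcomment -%}
{%- if group.user_agent.value == '*' -%}
  {{ 'Disallow: /*?q=alphabetical' }}
{%- endif -%}
```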

In a nutshell, Shopify’s robots.txt.liquid file gives SEO pros more control over their site’s crawl process. For most small Shopify stores, the default settings are probably sufficient. However, as your site grows and gets more complex, you might want to fine-tune your robots.txt to suit your unique needs. So, if you’ve got any questions about robots.txt or need help with Shopify SEO, don’t hesitate to reach out! Happy crawling!