How to modify the Shopify Robots.txt (robots.txt.liquid)


Shopify has recently rolled out the functionality to modify the robots.txt file. This is fantastic news for SEOs and developers alike working with Shopify stores.

If you have been working with Shopify eCommerce storefronts over the last few years, you will have been acutely aware that the robots.txt file could not be modified. That is, unless you were using workarounds such as service workers.

This week, across social media channels, I saw a couple of mentions that Shopify had quietly rolled out the functionality to edit/modify the robots.txt. Here is the article that I saw being circulated -

That led me to check whether this was true and, secondly, to conduct some tests. I can confirm: yes, it is true, and yes, you can edit the Shopify robots.txt file. In this article, I will show you how to create a robots.txt.liquid template file and how to modify it to include your own directives. For this guide, I spun up one of the free two-week trial stores using the default Debut theme.

Create the robots.txt.liquid template file

As this functionality is brand new (June 2021), Shopify does not make the robots.txt.liquid template file available by default. You have to create the file yourself. To do this, I created a new article template file in the theme code editor and named it robots.txt.

This will create a default article template. The next step is to clear out all of the article Liquid markup so that you are left with a blank file, then rename the file from article.robots.txt.liquid to robots.txt.liquid.

Steps to follow
1. Create a new article template file, saving the file as robots.txt
2. Clear out all of the default article Liquid markup
3. Copy/paste in the robots.txt boilerplate
4. Save the file
5. Rename the file from article.robots.txt.liquid to robots.txt.liquid


Copy/paste the boilerplate Liquid markup shown below into the file. Before you save the file, remove or modify the test directives and the test sitemap location with whatever is pertinent to optimising your robots.txt file.

Robots.txt Boilerplate
The code below is the robots.txt.liquid boilerplate (with comments and test directives/sitemap):
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {%- comment -%}
      Add your robots.txt directives below.
      These two are test directives - replace or remove them.
    {%- endcomment -%}
    {{ 'Disallow: /testing/123*' }}
    {{ 'Disallow: /sitebee*' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}

    {%- comment -%}
      Add your custom sitemap path below.
      The URL is a placeholder - swap in your own sitemap location.
    {%- endcomment -%}
    {{ 'Sitemap: https://example.com/custom-sitemap.xml' }}
  {%- endif %}
{% endfor %}
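The same loop can also be used to remove one of Shopify's default rules rather than only appending new ones, since each rule is exposed as an object with directive and value properties in Shopify's Liquid reference. Below is a minimal sketch that skips a default Disallow rule; the /search path is an illustrative target, so check which rules your store actually outputs before filtering:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- comment -%}
      Output every default rule except "Disallow: /search".
      The /search value is an example - substitute the rule you want to drop.
    {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/search' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Be cautious with this pattern: removing default rules can expose pages Shopify deliberately keeps out of the index.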

Default Shopify Robots.txt file with amends


The file that we have just created is the standard default robots.txt file with two additional custom (test) disallow directives and a custom sitemap location declaration. My recommendation is to remove them and replace them with your own custom directives.
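To sanity-check the result, browse to /robots.txt on the storefront after saving the template. Roughly, the rendered file should look something like the sketch below. The default groups and rules are generated by Shopify and vary by store, so treat this as an illustration rather than the exact output:

```text
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
# ...further Shopify default rules...
Disallow: /testing/123*
Disallow: /sitebee*
Sitemap: https://your-store.myshopify.com/sitemap.xml
```

If the test directives appear under the * user agent as shown, the template is being picked up correctly.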

And that wraps up this quick tutorial on how to edit the robots.txt file for Shopify websites.