
Customize the robots.txt file on your Shopify store

Shopify changed the game for e-commerce with its multifaceted tools for managing and growing a business. As powerful and flexible as the platform is, merchants have in the past had issues with parts of the system being locked away from them. Now, Shopify allows you to customize one more of those pieces: the robots.txt file.

What is the robots.txt file? The robots.txt file tells search engine crawlers and other web robots how you want your site crawled, including which pages they may and may not visit. The file won't appear anywhere in your theme or inside the Shopify admin by default, so you might think it isn't possible to customize it. But not only is it possible, it's fairly easy to do.
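
For context, robots.txt is just a plain-text file of directives grouped by user agent, served from the root of your domain. A minimal, purely illustrative example (example.com is a placeholder, and these are not Shopify's actual rules) looks like this:

# Rules for every crawler
User-agent: *
Disallow: /cart
Disallow: /checkout

# Point crawlers at the XML sitemap
Sitemap: https://example.com/sitemap.xml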

Here are the steps to customize your robots.txt file:

  • Create the robots.txt template. From your theme's code editor, select Add a new template, choose robots as the template type, and then select Create template. This generates the default robots.txt file, which looks something like this:

[Screenshot: the default robots.txt.liquid file]

  • For developers using Theme Kit, an alternative way to do the first step is to create the file directly in your local copy of the theme. Create the new file in the templates directory, name it robots.txt.liquid, and then add the default configuration to it:

# we use Shopify as our ecommerce platform
{%- comment -%}
# Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

It's important to note that you won't see any difference with the default file in place, because it simply reproduces the default rules that have always applied to your Shopify store. The powerful part comes when you alter the file.
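
For reference, those default groups render into the plain robots.txt that has always been served at /robots.txt on your storefront. Here is an abridged, illustrative sample of the catch-all group (your-store.myshopify.com is a placeholder, and the exact rule set varies by store and may change over time):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /account
Sitemap: https://your-store.myshopify.com/sitemap.xml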

  • Customize the file to stop search engines from crawling the “all” collection page, the search results page, product tag pages, and blog tag pages. These are all extraneous pages that users don't need to find via search engines. You can make this customization by adding these rules:

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /collections/all*' }}
    {{ 'Disallow: /*?q=*' }}
    {{ 'Disallow: /collections/*/*' }}
    {{ 'Disallow: /blogs/*/tagged/*' }}
  {%- endif -%}

  • Add the new rules below the existing rules inside the group loop, just before the sitemap block. The final file will look like this:

# we use Shopify as our ecommerce platform
{%- comment -%}
# Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /collections/all*' }}
    {{ 'Disallow: /*?q=*' }}
    {{ 'Disallow: /collections/*/*' }}
    {{ 'Disallow: /blogs/*/tagged/*' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
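
Once you save the template, you can sanity-check the result by loading /robots.txt on your storefront (for example https://your-store.myshopify.com/robots.txt, substituting your own domain). For the catch-all * group, the output should now contain the four new directives alongside the defaults, roughly like this:

User-agent: *
# ...default rules such as /admin, /cart, and so on...
Disallow: /collections/all*
Disallow: /*?q=*
Disallow: /collections/*/*
Disallow: /blogs/*/tagged/*
Sitemap: https://your-store.myshopify.com/sitemap.xml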

Now that you know the robots.txt trick, your options for customizing how your Shopify store gets crawled are nearly endless. You might want to hide specific products or collections, or add special rules required by particular crawlers. It all depends on the needs of your shop. Have fun personalizing your Shopify store with this robots.txt customization method, and enjoy the benefits it brings to your growing business.
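
As a final sketch of those two ideas, here is one hedged example. The collection handle private-sale and the crawler name ExampleBot are placeholders invented for illustration; the pattern itself is the same one used above: rules for the default groups go inside the loop, and plain text added after the loop is passed straight through into the rendered robots.txt.

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Hide one specific collection from all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /collections/private-sale' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

# Keep one particular crawler out entirely (plain text renders as-is)
User-agent: ExampleBot
Disallow: /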
