Crawl errors are one of the most common technical SEO issues on Shopify stores, yet they are often ignored because they happen silently. Google encounters an error, moves on to the next URL, and you never know the page was not indexed. Over time, unresolved crawl errors accumulate, wasting crawl budget, losing link equity, and preventing important pages from appearing in search results.

This guide provides a systematic process for finding and fixing every type of crawl error on Shopify. It complements our technical SEO for Shopify guide and is part of what we deliver in every SEO engagement.

Types of crawl errors on Shopify

Understanding the different types of crawl errors helps you prioritise fixes effectively.

404 errors (Page Not Found)

The most common crawl error. A 404 occurs when Googlebot requests a URL that does not exist. On Shopify, these typically arise from deleted products, changed URL handles, removed blog posts, or old external links pointing to URLs that no longer exist.

Redirect errors

Redirect chains (A → B → C), redirect loops (A → B → A), and redirects to 404 pages all create crawl errors. These are common after platform migrations or when URL handles are changed multiple times.

Soft 404 errors

A soft 404 occurs when a page returns a 200 (OK) status code but contains no meaningful content. Google detects that the page is effectively empty and reports it as a soft 404. On Shopify, these often occur on empty collection pages, out-of-stock product pages with no content, or search results pages with zero results.

Server errors (5xx)

Server errors indicate a problem on Shopify’s end. These are rare on Shopify’s hosted infrastructure but can occur during platform incidents or when theme code causes Liquid rendering errors.

Blocked by robots.txt

Some URLs may be blocked from crawling by the robots.txt file. Shopify’s default robots.txt blocks several URL patterns (admin, cart, checkout, search), but custom additions or app modifications can accidentally block important pages.

[Image: Google Search Console Pages report showing different types of crawl and indexing errors]
Google Search Console categorises crawl and indexing issues by type, helping you prioritise fixes.

Step 1: Find crawl errors using Search Console

Google Search Console is your primary tool for identifying crawl errors.

Check the Pages report

Navigate to Pages (formerly Coverage) in Google Search Console. This report shows every URL Google has attempted to crawl, categorised by status. Pay attention to the “Not indexed” section, which lists reasons why pages are excluded from the index.

Key statuses to investigate

  • Not found (404): Pages that returned a 404 error
  • Redirect error: Pages with redirect loops, chains, or redirects to errors
  • Soft 404: Pages that return 200 but have no content
  • Server error (5xx): Pages that returned server errors
  • Blocked by robots.txt: Pages blocked from crawling
  • Crawled, currently not indexed: Pages crawled but deemed not valuable enough to index
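When triaging a crawl export, it helps to bucket URLs into these same categories programmatically. The sketch below is a minimal classifier; the 50-word "thin content" threshold for flagging possible soft 404s is an assumption you should tune for your store.

```python
def classify_crawl_result(status, redirect_hops=0, word_count=None):
    """Bucket a crawled URL into Search Console-style categories."""
    if status >= 500:
        return "server error (5xx)"
    if status == 404:
        return "not found (404)"
    if 300 <= status < 400 or redirect_hops > 0:
        # More than one hop suggests a chain worth flattening
        return "redirect chain" if redirect_hops > 1 else "redirect"
    if status == 200 and word_count is not None and word_count < 50:
        return "possible soft 404"  # thin-page threshold is an assumption
    return "ok"

print(classify_crawl_result(404))                    # not found (404)
print(classify_crawl_result(200, word_count=10))     # possible soft 404
print(classify_crawl_result(301, redirect_hops=2))   # redirect chain
```

Feed it rows from a Screaming Frog export (status code, redirect count, word count) to get a first-pass priority list.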

Run a full site crawl

Search Console only shows issues for URLs Google has attempted to crawl. To find additional issues, run a full site crawl using Screaming Frog or Sitebulb. This reveals internal linking issues, orphaned pages, and problems that Google has not yet reported.

Step 2: Fix 404 errors

Not all 404 errors need fixing, but important ones must be addressed promptly.

Prioritise by importance

Check each 404 URL for backlinks (using Ahrefs or SEMrush) and previous traffic (using Google Analytics). 404 pages that have backlinks or previously ranked for keywords should be redirected immediately. 404 pages with no backlinks and no previous value can be safely ignored.
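That decision rule can be captured in a few lines. This is a sketch, and the traffic threshold of 10 sessions is an assumption; set it to whatever "previous value" means for your store.

```python
def triage_404(url, backlinks, sessions_last_90d):
    """Decide whether a 404 URL warrants a redirect.
    Any backlink, or meaningful recent traffic, makes it worth keeping."""
    if backlinks > 0 or sessions_last_90d >= 10:  # threshold is an assumption
        return "redirect"
    return "ignore"

print(triage_404("/products/old-handle", backlinks=3, sessions_last_90d=0))   # redirect
print(triage_404("/pages/forgotten-promo", backlinks=0, sessions_last_90d=0)) # ignore
```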

Redirect deleted products

When you delete a product on Shopify, its URL returns a 404. If the product had backlinks or traffic, redirect it to the most relevant collection page or a similar product. In Shopify, go to Online Store > Navigation > URL Redirects to create the redirect.
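For more than a handful of redirects, the admin UI gets tedious. Shopify's Admin REST API exposes a redirect resource that accepts the same path/target pair; the sketch below builds the payload and posts it with the standard library. The API version string and the exact access scope on your token are assumptions — check your store's API settings before running this.

```python
import json
import urllib.request

def build_redirect_payload(old_path, target):
    """Payload shape for Shopify's Admin REST API redirect resource."""
    return {"redirect": {"path": old_path, "target": target}}

def create_redirect(shop, token, old_path, target, api_version="2024-01"):
    # POSTs to the Admin API; needs an app token with redirect write access.
    # api_version is an assumption -- use a version your store supports.
    url = f"https://{shop}/admin/api/{api_version}/redirects.json"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_redirect_payload(old_path, target)).encode(),
        headers={"Content-Type": "application/json",
                 "X-Shopify-Access-Token": token},
        method="POST",
    )
    return urllib.request.urlopen(req)  # network call -- run against your own store

payload = build_redirect_payload("/products/discontinued-widget", "/collections/widgets")
print(payload["redirect"]["path"])  # /products/discontinued-widget
```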

Fix internal links

If internal links on your site point to 404 URLs, update them to point to the correct pages. Use a crawl tool to find all internal links to 404 pages and fix them in your theme code, navigation, or content. For a deeper dive into handling deleted content, see our Shopify agency guide.

[Image: Workflow for prioritising and fixing 404 errors based on backlinks and traffic]
Prioritise 404 fixes by checking for backlinks and previous traffic. High-value 404s need immediate redirects.

Step 3: Fix redirect errors

Redirect errors waste crawl budget, and every extra hop in a chain leaks link equity.

Fix redirect chains

A redirect chain occurs when URL A redirects to B, which redirects to C. Each hop loses a small amount of link equity. Update the redirect so A goes directly to C. In Shopify, delete the intermediate redirect and create a direct redirect from the original URL to the final destination.
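If you export your redirect list (Shopify lets you export URL redirects as CSV), chains and loops can be found mechanically. A minimal sketch: given a mapping of old path to new path, resolve each source to its final destination and flag loops with `None`.

```python
def flatten_redirects(redirects):
    """Resolve each source URL to its final destination.
    `redirects` maps old path -> new path. Loops resolve to None."""
    flattened = {}
    for start in redirects:
        seen = {start}
        current = redirects[start]
        while current in redirects:
            if current in seen:          # loop detected (A -> B -> A)
                flattened[start] = None
                break
            seen.add(current)
            current = redirects[current]
        else:
            flattened[start] = current   # chain ends at a real page
    return flattened

chains = {"/a": "/b", "/b": "/c", "/x": "/y", "/y": "/x"}
print(flatten_redirects(chains))  # {'/a': '/c', '/b': '/c', '/x': None, '/y': None}
```

Every entry whose flattened destination differs from its current target is a chain to collapse into a single direct redirect; every `None` is a loop to break.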

Fix redirect loops

A redirect loop occurs when A redirects to B and B redirects back to A. This creates an infinite loop. Identify which URL should be the final destination and delete the other redirect.

Fix redirects to 404s

If a redirect points to a URL that no longer exists, update it to point to the correct current page. This is common after multiple rounds of URL changes.

Step 4: Fix soft 404 errors

Soft 404s are trickier because the pages technically work but have no useful content.

Empty collection pages

Collection pages with zero products return a 200 status but show no content. Either add products to the collection, redirect it to a related collection, or add a noindex meta tag until you have products to display.
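Those three options reduce to a simple decision, and product counts can often be checked via Shopify's public `/collections/<handle>/products.json` endpoint — though its availability varies by store, so treat that endpoint as an assumption and verify on your own domain.

```python
import json
import urllib.request

def recommend_action(product_count, has_planned_products):
    """Decide what to do with a collection page based on its product count."""
    if product_count > 0:
        return "keep"
    # Empty now: noindex if products are coming, otherwise redirect it away
    return "noindex" if has_planned_products else "redirect"

def collection_is_empty(shop, handle):
    # Public storefront JSON endpoint; availability is an assumption -- verify
    # it responds on your store before relying on it.
    url = f"https://{shop}/collections/{handle}/products.json?limit=1"
    with urllib.request.urlopen(url) as resp:
        return len(json.load(resp)["products"]) == 0

print(recommend_action(12, has_planned_products=False))  # keep
print(recommend_action(0, has_planned_products=True))    # noindex
```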

Out-of-stock product pages

Product pages for permanently discontinued items can become soft 404s if stripped of all content. If the product will return, keep it with a “currently unavailable” message and back-in-stock notification. If permanently discontinued, redirect to the relevant collection or a replacement product.

Search results pages

Shopify’s internal search creates URLs like /search?q=term. When these pages have zero results, they are soft 404s. Shopify’s default robots.txt blocks /search from crawling, but verify this is still in place on your store. This connects to our enterprise Shopify guidance.

[Image: Examples of soft 404 pages on Shopify: empty collection, out-of-stock product, zero-result search]
Soft 404s look like real pages to users but contain no useful content for Google to index.

Step 5: Review robots.txt and blocked resources

Shopify generates a robots.txt file that can be customised through the robots.txt.liquid template.

Check default blocked paths

Shopify’s default robots.txt blocks admin, cart, checkout, and search paths. Verify these are still in place and that no custom additions have accidentally blocked important pages.
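You can sanity-check rules locally with Python's standard-library robots parser before touching the live file. The rules below are a trimmed example mirroring Shopify's defaults, not the full file — fetch your store's actual `/robots.txt` to test the real thing.

```python
from urllib.robotparser import RobotFileParser

# Trimmed example of Shopify's default rules (assumption -- check your live file)
robots_txt = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

base = "https://example-store.myshopify.com"  # hypothetical store domain
for path in ["/collections/sale", "/search?q=boots", "/admin/products"]:
    verdict = "allowed" if parser.can_fetch("Googlebot", base + path) else "blocked"
    print(path, verdict)
```

If a page you expect to be indexed comes back "blocked", a custom rule or app modification is the likely culprit.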

Check for app-added rules

Some apps modify the robots.txt file. Review it for any unexpected disallow rules that might be blocking important content.

Test with Search Console

Use the URL Inspection tool in Search Console to test whether specific URLs are blocked by robots.txt. If a page you want indexed is blocked, update the robots.txt template to allow it.

Step 6: Optimise crawl budget

For larger Shopify stores, crawl budget optimisation ensures Google crawls your most important pages regularly.

Reduce duplicate URLs

Duplicate content creates unnecessary URLs for Google to crawl. Implement proper canonical tags, noindex tag-filtered URLs, and update internal links to use canonical URLs. See our duplicate content guide for details.
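To audit canonicals at scale, extract the `rel=canonical` link from each crawled page and compare it to the URL you requested; any mismatch marks a duplicate variant. A minimal stdlib sketch:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pulls the rel=canonical href out of a page's HTML, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Hypothetical fragment of a tag-filtered collection page
html = '<head><link rel="canonical" href="https://example-store.com/collections/sale"></head>'
finder = CanonicalFinder()
finder.feed(html)

requested_url = "https://example-store.com/collections/sale/red"
print(finder.canonical)                          # the canonical target
print(finder.canonical != requested_url)         # True -> duplicate variant
```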

Submit an XML sitemap

Shopify generates an XML sitemap automatically at /sitemap.xml. Submit it in Google Search Console to help Google discover all your important pages. Verify the sitemap includes all products, collections, and blog posts you want indexed.
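Shopify's `/sitemap.xml` is a sitemap index pointing at child sitemaps for products, collections, pages, and blogs. A short script can pull every `<loc>` so you can cross-check the list against what you expect indexed; the example index below is a shortened illustration, not a real store's file.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_locs(xml_text):
    """Extract every <loc> from a sitemap or sitemap index document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

# Shortened example of the index Shopify serves at /sitemap.xml
index = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example-store.com/sitemap_products_1.xml</loc></sitemap>
  <sitemap><loc>https://example-store.com/sitemap_collections_1.xml</loc></sitemap>
</sitemapindex>"""

print(sitemap_locs(index))
```

Fetch each child sitemap the same way to build the full URL inventory, then diff it against your product and collection lists.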

Improve page load speed

Faster pages get crawled more efficiently. Google can crawl more pages per second when each page loads quickly. Speed improvements benefit both crawl efficiency and rankings.

Step 7: Set up ongoing monitoring

Crawl errors recur as products are added, deleted, and modified. Ongoing monitoring catches new issues before they impact SEO.

Weekly Search Console checks

Check the Pages report weekly for new errors. Set up email notifications in Search Console to receive alerts when critical issues are detected.

Monthly site crawls

Run a full site crawl monthly and compare results against the previous month. Look for new 404s, new redirect chains, and new indexing issues.
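The month-over-month comparison is just a set difference on the erroring URLs from each crawl export. A minimal sketch, assuming you have each month's error URLs as a set:

```python
def new_errors(last_month, this_month):
    """URLs erroring now that were clean (or absent) in the previous crawl."""
    return this_month - last_month

# Hypothetical error URLs from two monthly crawls
march = {"/products/old-widget", "/pages/retired"}
april = {"/products/old-widget", "/pages/retired", "/collections/spring-sale"}

print(sorted(new_errors(march, april)))  # ['/collections/spring-sale']
print(sorted(new_errors(april, march)))  # [] -- nothing new, some resolved
```

The reverse difference (`last_month - this_month`) tells you which errors were resolved, which is worth tracking too.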

Track crawl stats

In Search Console, check Settings > Crawl stats. This shows how many pages Google crawls per day, the average response time, and any crawl errors encountered. A sudden drop in crawl rate or spike in errors indicates a problem that needs immediate attention.

[Image: Google Search Console crawl stats showing daily crawl requests, download size, and response time]
Crawl stats in Search Console reveal how efficiently Googlebot can access your Shopify store.

Crawl errors are the silent killers of ecommerce SEO. They do not cause dramatic ranking drops — they cause gradual erosion as pages fall out of the index and link equity leaks through broken URLs. Systematic monitoring and prompt fixes prevent this slow decline.

Andrew Simpson, Founder

Bringing it together

Finding and fixing crawl errors on Shopify is a systematic process: identify errors in Search Console, prioritise 404s by importance, fix redirect issues, address soft 404s, review robots.txt, optimise crawl budget, and set up ongoing monitoring. The stores that maintain clean crawl profiles are the ones that treat this as an ongoing maintenance task, not a one-off project.

If your Shopify store has accumulated crawl errors and you need help cleaning them up, get in touch. We include crawl error auditing and resolution as part of our technical SEO services.