Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Thousands of 503 errors in GSC for pages not important to organic search - Is this a problem?
-
Hi, folks
A client of mine now has roughly 30,000 503 errors (found in the Crawl Errors section of GSC). These are mostly pages with limited-time offers and deals. The 503 error seems to occur when an offer expires and the page is no longer of any use. These pages are not important for organic search, but they get traffic, mostly from direct visits and newsletters.
My question:
Does having a high number of 503 pages reported in GSC constitute a problem for the organic ranking of the domain and of the category and product pages (the pages that I want to rank for organically)? If it does, what is the best course of action to mitigate the problem?
Looking forward to your answers!
- Sigurd
-
Hi, Andy. Thank you so much for the insights
-
Yeah, I'd have to say that a 404 would be far preferable. A 301 would be ideal, but it would take some bandwidth to redirect to the next-most-relevant page whenever a deal expires.
-
Hi Sigurd,
A 503 seems like an odd choice in this circumstance - is this something that the eCommerce software dictates, or are there options for this?
A 503 means Service Unavailable and isn't something I would expect to see on a page that has expired. There are a number of alternatives depending on the product and niche. If Google is seeing thousands of 503s, it might make it look like the site is having problems.
If a page really must go, then it should be a 404 / 410, or a 301 to a close alternative, but not a 503.
Are there any indications that these are causing issues with the site's SERPs?
-Andy
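Andy's recommendation above (404/410 when a page is truly gone, a 301 when a close alternative exists, never a 503) can be sketched as a small routing helper. This is a minimal illustration; the function name and the offer/redirect fields are hypothetical stand-ins, not part of any particular eCommerce platform:

```python
# Minimal sketch of the status-code logic for expired offer pages.
# The offer structure and redirect lookup are hypothetical stand-ins for
# whatever the shop software actually provides.

def status_for_offer(offer, redirect_target=None):
    """Return (status_code, location) for an offer page.

    Live offer                        -> 200
    Expired, close alternative exists -> 301 to that alternative
    Expired, nothing comparable       -> 410 Gone (a 404 also works)

    A 503 is never appropriate here: it tells crawlers the site is
    temporarily broken, not that the offer has ended.
    """
    if not offer.get("expired", False):
        return 200, None
    if redirect_target:
        return 301, redirect_target
    return 410, None
```

At scale, the 301 branch is the part that costs effort, since someone (or some rule) has to pick the next-most-relevant page for each expired deal.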
Related Questions
-
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
I am cataloguing the pages on our website in terms of which focus keyword has been used with each page. I've noticed that some pages repeat the same keyword/term. I've heard that it's not really good practice, as it's like telling Google conflicting information: pages with the same keywords will be competing against each other. Is this correct? If so, is the alternative to use various long-tail keywords instead? If not, meaning it's OK to repeat a keyword on different pages, is there a maximum recommended number of times we should repeat it? Still new-ish to SEO, so any help is much appreciated! V.
Intermediate & Advanced SEO | Vitzz -
Why does Google rank a product page rather than a category page?
Hi, everybody. In the Moz ranking tool for one of our clients' accounts (the client sells sports equipment), there is a trend where more and more of their landing pages are product pages instead of category pages. The optimal landing page for the term "sleeping bag" is of course the sleeping bag category page, but Google is sending searchers to a product page for a specific sleeping bag. What could be the critical factors that make a product page more relevant than the category page as the landing page?
Intermediate & Advanced SEO | Inevo -
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy -
Different Header on Home Page vs Sub pages
Hello, I am an SEO/PPC manager for a company that does medical detox. You can see the site in question here: http://opiates.com. My question is, I've never heard of it specifically being a problem to have a different header on the home page of a site than on the subpages, but I rarely see it either. Most sites, if I'm not mistaken, use a consistent header across most of the site. However, a person I'm working for now said that she has had other SEOs look at the site (above), and they always say that it is a big SEO problem to have a different header on the homepage than on the subpages. Any thoughts on this subject? I've never heard of this before. Thanks, Jesse
Intermediate & Advanced SEO | Waismann -
Redirect Search Results to Category Pages
I am planning to redirect search results to their matching category pages to avoid having two indexed pages of essentially the same content. Example: http://www.example.com/search/?kw=sunglasses will be redirected to http://www.example.com/category/sunglasses/. Is this a good idea? What are the possible negative effects if I go this route? Thanks.
Intermediate & Advanced SEO | WizardOfMoz -
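The mapping described in the question above can be sketched as a small helper, assuming the URL patterns from the example; the category list and function name are purely illustrative:

```python
# Hypothetical sketch: map an internal search URL to the category page it
# should 301 to. Only exact matches against known category slugs redirect;
# anything else is left alone (or handled with noindex instead).
from urllib.parse import urlparse, parse_qs

KNOWN_CATEGORIES = {"sunglasses", "watches"}  # assumed category slugs

def search_to_category(url):
    parsed = urlparse(url)
    if parsed.path.rstrip("/") != "/search":
        return None  # not a search results URL
    kw = parse_qs(parsed.query).get("kw", [""])[0].strip().lower()
    return f"/category/{kw}/" if kw in KNOWN_CATEGORIES else None
```

Redirecting only exact matches avoids sending users to a category page that doesn't actually answer their query.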
Organic search traffic dropped 40% - what am I missing?
Have a client (ecommerce site with 1,000+ pages) who recently switched to OpenCart from another cart. Their organic search traffic (from Google, Yahoo, and Bing) dropped roughly 40%. Unfortunately, we weren't involved with the site before, so we can only rely on the Wayback Machine to compare previous to present. I've checked all the common causes of traffic drops, and so far I mostly know what's probably not causing the issue. Any suggestions?
- Some URLs are the same and the rest 301 redirect (note that many of the pages were 404 until a couple weeks after the switch, when the client implemented more 301 redirects)
- They've got an XML sitemap and are well-indexed
- The traffic drops hit pretty much across the site; they are not specific to a few pages
- The traffic drops are not specific to any one country or language
- Traffic drops hit mobile, tablet, and desktop
- I've done a full site crawl: only one 404 page and no other significant issues
- The site crawl didn't find any pages blocked by nofollow, noindex, or robots.txt
- Canonical URLs are good
- The site has about 20K pages indexed
- They have some bad backlinks, but I don't think it's backlink-related because Google, Yahoo, and Bing have all dropped
- I'm comparing on-page optimization for select pages before and after, and not finding a lot of differences
- It does appear that they implemented Schema.org when they launched the new site
- Page load speed is good
I feel there must be a pretty basic issue here for Google, Yahoo, and Bing to all drop off, but so far I haven't found it. What am I missing?
Intermediate & Advanced SEO | AdamThompson -
How important are sitemap errors?
If there aren't any crawling / indexing issues with your site, how important do you think sitemap errors are? Do you work to always fix all errors? I know here: http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds Duane Forrester mentions that sites with many 302s / 301s will be punished. Does anyone know Google's take on this?
Intermediate & Advanced SEO | nicole.healthline -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers. I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster Notification: "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
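The trade-off in the question above (a robots.txt Disallow stops Google from recrawling the URLs, so it would never see a noindex tag on pages that are already in the index) can be sketched as a tiny decision helper. The function name and return values are purely illustrative, not from any library:

```python
# Illustrative sketch: pick a de-indexing mechanism for internal search pages.
# Already-indexed URLs need to stay crawlable so Google can see the noindex
# tag; URLs that were never indexed can simply be blocked from crawling.

def deindex_mechanism(already_indexed):
    if already_indexed:
        # Let Google recrawl and see the tag; "follow" keeps link equity flowing.
        return '<meta name="robots" content="noindex, follow">'
    # Never indexed: blocking the crawl in robots.txt is enough (and cheaper).
    return "Disallow: /search/"
```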