Is there a limit to the number of duplicate pages pointing to a rel='canonical' primary?
-
We have a situation on twiends where a number of our 'dead' user pages have generated links for us over the years. Our options are to 404 them, 301 them to the home page, or just serve back the home page with a canonical tag.
We've been 404'ing them for years, but I understand that we lose all the link juice from doing this. Correct me if I'm wrong?
Our next plan would be to 301 them to the home page. That's probably the best solution, but our concern is that if a user page is only temporarily down (under review, etc.), it could be permanently removed from the index, or at least cached for a very long time.
A final plan is to just serve back the home page on the old URL, with a canonical tag pointing to the home page URL. This is quick, retains most of the link juice, and allows the URL to become active again in the future. The problem is that there could be 100,000's of these.
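The three options above can be sketched as plain response handlers. This is a hypothetical illustration, not code from twiends; the `HOME` URL and the handler names are made up, and a real implementation would sit inside whatever web framework the site uses.

```python
# Sketch of the three ways to answer a request for a retired user page.
# HOME and the handler names are illustrative assumptions.

HOME = "https://example.com/"

def respond_404():
    # Hard 404: the crawler eventually drops the URL, and link juice
    # pointing at it is lost.
    return 404, {}, "<h1>Page not found</h1>"

def respond_301():
    # Permanent redirect: passes most link equity to the home page,
    # but signals the old URL is gone for good.
    return 301, {"Location": HOME}, ""

def respond_canonical():
    # Serve the home page body with a canonical hint: the URL stays
    # resolvable and can be reactivated later, at the cost of Google
    # crawling many duplicate pages.
    body = '<link rel="canonical" href="%s"/>' % HOME
    return 200, {}, body
```

The trade-off the thread is wrestling with is visible in the return values: only the 301 and canonical variants keep a path back to the home page, and only the 404 frees up crawl budget.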
Q1) Is it a problem to have 100,000 URLs pointing to a primary with a rel=canonical tag? (Problem for Google?)
Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will google recrawl it and add it back into the index? Do we need to use WMT to speed this process up?
Thanks
-
I'll add this article by Rand that I came across too. I'm busy testing the solution presented in it:
https://moz.com/blog/are-404-pages-always-bad-for-seo
In summary: 404 all dead pages with a good custom 404 page so as not to waste crawl bandwidth, then selectively 301 those dead pages that have accrued good link value.
Thanks Donna/Tammy for pointing me in this direction.
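A hypothetical sketch of that triage: hard-404 everything by default, and only emit 301 rules for dead pages whose inbound-link count (pulled from a Moz or Search Console export, say) clears some threshold. The URLs, link counts, and threshold here are invented for illustration.

```python
# Triage dead pages: 301 the ones with real link value, hard-404 the rest.
# dead_pages maps a retired URL to its inbound-link count (illustrative data).

def pick_redirects(dead_pages, min_links=5):
    """Return the dead URLs worth a 301 to the home page, sorted;
    everything else should keep returning a hard 404."""
    return sorted(url for url, links in dead_pages.items() if links >= min_links)

dead_pages = {
    "/user/alice": 12,  # several good inbound links -> worth a 301
    "/user/bob": 0,     # no links -> plain 404
    "/user/carol": 7,   # -> worth a 301
}

for url in pick_redirects(dead_pages):
    print("301 %s -> /" % url)
```

This keeps the "administrative nightmare" bounded: only the few hundred URLs with real links get redirect rules, and the other 100,000 stay as cheap 404s.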
-
In this scenario, yes: a customized 404 page with links to a few useful top-level pages would serve both the user and Google better. From a strictly SEO standpoint, 100,000 redirects and/or canonical tags would not benefit your SEO.
-
Thanks Donna, good points..
We return a hard 404, so it's treated correctly by Google. We are just looking at this from an SEO point of view now to see if there's any way to reclaim this lost link juice.
Your point about looking at the value of those incoming links is a good one. I suppose it's not worth making Google crawl 100,000 more pages for the sake of a few links. We've just started seeing these pop up in Moz Analytics as link opportunities, and we can see them as 404s in Site Explorer too. There are a few hundred of these incoming links that point to a 404, so we feel this could have an impact.
I suppose we could selectively 301 any higher-value links to the home page. It will be an administrative nightmare, but doable.
How do others tackle this problem? Does everyone just hard 404 a page, even though that loses the link juice from incoming links to it?
Thanks
-
Hi David,
When you say "we've been 404'ing them for years", does that mean you've created a custom 404 page that explains the situation to site visitors or does it mean you've been letting them naturally error and return the appropriate 404 (page not found) error to Google? It makes a difference. If the pages truly no longer exist and there is no equivalent replacement, you should be letting them naturally error (return a 404 return code) so as not to mislead Google's robots and site visitors.
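The hard-404 vs soft-404 distinction can be shown in a minimal WSGI handler (a sketch, not twiends' actual code): the custom page can be as helpful as you like, as long as the status line is a real 404. Serving the same "not found" page with `200 OK` is what produces the soft 404s Google warns about.

```python
# Minimal WSGI app returning a "hard" 404: a friendly custom page,
# but with a genuine 404 status line so crawlers treat the URL as gone.
# Returning "200 OK" here instead would create a soft 404.

def custom_404_app(environ, start_response):
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Sorry, that page is gone.</h1>"
            b"<p>Try our <a href='/'>home page</a> instead.</p>"]
```

The body is for humans; the status line is for robots. Both can coexist, which is the point Donna is making.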
Have you looked at the value of those incoming links? They may be low value anyway. There may be more valuable things you could be doing with your time and budget.
To answer your specific questions:
Q1) Is it a problem to have 100,000 URLs pointing to a primary with a rel=canonical tag? (Problem for Google?)
Yes, if those pages (or valuable replacements) don't actually exist. You'd be wasting valuable crawl budget. This looks like it might be especially true in your case given the size of your site. Check out this article. I think you might find it very helpful. It's an explanation of soft 404 errors and what you should do about them.
Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will google recrawl it and add it back into the index? Do we need to use WMT to speed this process up?
If the canonical tag is changed or removed, Google will find and reindex it next time it crawls your site (assuming you don't run out of crawl budget). You don't need to use WMT unless you're impatient and want to try to speed the process up.
-
Thanks Sandi, I did.
It's a great article and it answered many questions for me, but I couldn't really get clarity on my last two questions above.
-
Hey David
Check out this Moz Blog post about rel=canonical, appropriately named "Rel=Confused?"