How to fix issues from 301s
-
Case:
We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue:
My assumption is that many of the internal links to these primary landing pages (from pages which are now 301ed as well) are still pointing to the old versions in Google's cache, and thus have not passed their importance and internal link juice to the new versions. There are no navigational links or entry points left to the old supporting pages, and I believe this is what is driving the decline.
Proposed resolution:
I intend to create a series of HTML sitemap pages covering the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as XML sitemaps, just as normal pages) with the option to crawl all linked pages selected. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
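For illustration, a minimal PHP sketch of generating one such HTML sitemap page (the old-urls.txt file name and markup are placeholders, not part of our actual setup):

    <?php
    // Build one HTML sitemap page that links to every recently 301ed .asp URL,
    // so that a crawler fetching this page discovers all of the redirects.
    // "old-urls.txt" is a hypothetical file with one absolute URL per line.
    $urls = file('old-urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    echo "<html><head><title>Legacy page index</title></head><body><ul>\n";
    foreach ($urls as $url) {
        $safe = htmlspecialchars($url, ENT_QUOTES);
        echo "<li><a href=\"$safe\">$safe</a></li>\n";
    }
    echo "</ul></body></html>\n";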
Question 1:
Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2:
Could the proposed solution be harmful?
Question 3:
Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated.
Thank you in advance,
David
-
I did run a search on our old pages in the SERPs and found that a large number of them are still showing. I also found most of our new pages, and in some cases both the old and new versions were represented. I have also seen a lot of our positions go from page one to not in the top 100; these are all pages which were 301ed to a nearly exact replica in the new version. I had originally thought Google had hit them but not updated their listings to the new versions. I am now thinking that they are just being ignored and have not had their 301s picked up.
-
Hi David,
Can you run a site:yourwebsite.com search in Google and report back? I.e., find some of those pages and then check the cache date on them if the URLs are the same. Similarly, if the URL structure has changed, you can check by combing through all the Google results for your site's pages in the SERPs. If old URLs that have been 301'd still exist in the index, then they haven't been recrawled and updated.
Once we know whether it's an issue of them not having been recrawled, you'll better understand what the possible issue is and can go from there.
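You could also verify the redirects server-side with a quick script. A PHP/curl sketch, assuming the old URLs are collected in a hypothetical old-urls.txt:

    <?php
    // Verify that each old URL answers with a 301 and report where it points.
    // "old-urls.txt" is a hypothetical file with one absolute URL per line.
    $urls = file('old-urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request is enough
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // inspect the first hop only
        curl_exec($ch);
        $code   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $target = curl_getinfo($ch, CURLINFO_REDIRECT_URL);
        curl_close($ch);
        echo "$url -> $code" . ($target ? " ($target)" : "") . "\n";
    }

Anything that doesn't print "-> 301" with the expected new URL is a redirect Google can't have picked up correctly.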
-
Related Questions
-
301 Question - issue
A while back we had a 'bleed' on one of our sites, which basically meant one of our sites started to leak across pages to another, and that site started to rank for the same pages; now we have hundreds of pages ranking for URLs that do not exist. It's hard to explain, bear with me. If you were to click on the cached view in Google for the ranked page, it would show you the main site, but if you were to click it as usual, you would be taken to the site and shown a 404, as the intended page was not for that site. We believe we fixed the 'bleed' and have set up 301s for all the affected pages to go to the home page of the site it affected. But these pages have not been removed from Google, which we thought a 301 would do. So we still have hundreds of pages being ranked that are redirected to the home page. Why haven't these pages been removed?
Intermediate & Advanced SEO | JH_OffLimits
-
Any downside to a whole bunch of 301s?
I'm working with a site that needs a whole bunch of old, deleted pages 301'd to new pages. My main goal is to capture any external links that currently go to a 404 page, and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links, or that may never have really existed in the first place. These links are a mix of http and https. Is there any potential downside to just 301ing a list of several hundred possible old URLs that currently trigger the 404 page? Thanks! Best... Mike
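One way to handle a batch like this without hand-writing hundreds of rules is a lookup map in the site's 404 handler. A minimal PHP sketch, with a hypothetical map and URLs:

    <?php
    // 404 handler: look up the requested path in a redirect map and 301 if found.
    // The map is illustrative; in practice it could be generated from a CSV of
    // old URLs and their new destinations.
    $redirects = [
        '/old-page-1.html' => '/new-page-1',
        '/old-page-2.html' => '/new-page-2',
    ];

    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (isset($redirects[$path])) {
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: https://www.example.com' . $redirects[$path]);
        exit;
    }

    // No mapping found: fall through to a normal 404.
    header('HTTP/1.0 404 Not Found');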
Intermediate & Advanced SEO | 945011
-
Is legacy duplicate content an issue?
I am looking for some proof, or at least evidence, as to whether or not sites are being hurt by duplicate content. The situation is that there were 4 content-rich newspaper/magazine-style sites that were basically just reskins of each other [a tactic used under a previous regime 😉 ]. The least busy of the sites has since been discontinued and 301d to one of the others, but the traffic on the discontinued site was so low as to be lost in noise, so it is unclear whether that was any benefit. For the last ~2 years all the sites have had unique content going up, but the archives of articles are still on all 3 remaining sites. Now I would like to know whether to redirect, remove or rewrite that content, but it is a big decision - the number of duplicate articles is 263,114! Is there a chance this is hurting one or more of the sites? Is there any way to prove it, short of actually doing the work?
Intermediate & Advanced SEO | Fammy
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin-content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if ($cityPageRemoved) { // placeholder for whatever check flags a removed city page
        // Tell crawlers the page is gone...
        header("HTTP/1.0 404 Not Found");
        // ...and point the visitor to the corresponding state-level page.
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have very few to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
-
Canonical url issue
My site https://ladydecosmetic.com is showing duplicate page title and duplicate page content errors in the SEOmoz crawl. I have downloaded the error report CSV and checked. According to the report, the URL below contains duplicate page content:
https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=40&brands=66&click=brnd
The other duplicate URLs per the report are:
https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&click=colorsu&brands=66
https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40
https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&brands=66&click=brnd
But on every one of these URLs (all 4) I have set a canonical URL pointing to the original page, which exists (not a 404):
https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=0
So why are these showing as duplicate page content? Please give me an answer ASAP.
Intermediate & Advanced SEO | trixmediainc
-
Can use of the id attribute to anchor to text down a page cause page duplication issues?
I am producing a long glossary of terms and want to make it easier to jump down to various terms. I am using the id attribute (e.g. <a id="anchor-text">), so am appending #anchor-text to a URL to reach the correct spot. Does anyone know whether Google will pick this up as separate duplicate pages? If so, any ideas on what I can do? Apart from not doing it to start with? I am thinking 301s won't work, as I want the URL to work. And rel=canonical won't work, as there is no actual page code to add it to. Many thanks for your help, Wendy
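For reference, a minimal sketch of the markup pattern being described (element names and URLs are illustrative); the part after the # is a fragment identifier, not a separate URL:

    <!-- Glossary entry whose id is the jump target -->
    <h3 id="anchor-text">Anchor text</h3>
    <p>Definition of the term goes here...</p>

    <!-- Link to that entry; the fragment does not change the page's URL -->
    <a href="/glossary#anchor-text">Jump to "Anchor text"</a>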
Intermediate & Advanced SEO | Chammy
-
Redirecting Canonical 301s and Magento Website
I have an issue with a client's website where it has 3,700+ pages, but roughly half of them are duplicates. Thankfully, the only difference between the originals and the duplicates is the "?print" at the end of each URL (I suppose this is Magento's way of making a printable version of the same page. I don't know, I didn't build it.) My question is, how can I get all the pages like this: http://www.mycompany.com/blah.html?print to redirect to pages like this: http://www.mycompany.com/blah.html Also, do they NEED to be canonical, or will a 301 redirect be sufficient? And after having done this, if anybody knows, is there a way I can turn that feature off in Magento? We're expanding our product line, and I don't want to have to keep chasing after these "?print" pages after the fact.
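Assuming the site runs on Apache, a generic mod_rewrite sketch (not a Magento-specific fix) that strips the print query string with a sitewide 301 might look like this:

    RewriteEngine On
    # Match any request whose query string is, or contains, "print"
    RewriteCond %{QUERY_STRING} (^|&)print(=|&|$)
    # Redirect to the same path; the trailing "?" drops the query string
    RewriteRule ^(.*)$ /$1? [R=301,L]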
Intermediate & Advanced SEO | ClifThompson
-
Hundreds of thousands of 404's on expired listings - issue.
Hey guys, We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404's - hundreds of thousands, maybe millions (note that Webmaster Tools peaks at 100,000). Many of these listings receive links. Classified listings less than 45 days old show other possible products to buy, based on an algorithm. It is not possible for Google to crawl expired listing pages from within our site; they are indexed because they were crawled before they expired, which means that many of them show in search results. -> My thought at this stage, for usability reasons, is to replace the 404's with content - other product suggestions - and add a meta noindex in order to help our crawl equity and get the pages we really want indexed prioritised. -> Another consideration is to 301 from each expired listing to the category hierarchy to pass possible link juice. But as many of these listings are findable in Google, we feel that is not a great user experience. -> Or shall we just leave them as 404's? Google sort of says it's OK. Very curious on your opinions and how you would handle this. Cheers, Croozie. P.S. I have read other Q&As regarding this, but given our large volumes and situation, thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
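For what it's worth, a rough sketch of the first option (a normal 200 page with suggestions plus a noindex; all markup here is placeholder):

    <?php
    // Expired listing: answer 200 with alternative suggestions, but keep the
    // page out of the index via an X-Robots-Tag header.
    header('X-Robots-Tag: noindex');
    ?>
    <html>
    <head>
      <!-- Equivalent meta form of the noindex directive -->
      <meta name="robots" content="noindex">
      <title>Listing expired</title>
    </head>
    <body>
      <p>This listing has expired. Similar items you might like:</p>
      <!-- algorithmic product suggestions rendered here -->
    </body>
    </html>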
Intermediate & Advanced SEO | sichristie