How to fix issues from 301s
-
Case:
We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
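A quick way to confirm which status code the old URLs actually return (301 vs. 302) is to request each one without following redirects. A minimal sketch in Python using only the standard library; the function names and usage are illustrative, not part of the migration described above:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the raw 3xx status is surfaced."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_status(url):
    """Return the first HTTP status code returned for url (e.g. 301 or 302)."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as e:
        return e.code  # 3xx responses surface here once redirects are refused

def is_permanent(code):
    # 301/308 are permanent moves and pass link equity; 302/307 are temporary
    return code in (301, 308)
```

Running `redirect_status` over the ~100 affected URLs (say, from a CSV export) would verify that every old page now answers with a 301 rather than a lingering 302.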
Issue:
My assumption is that many of the internal links (from pages which are now 301ed as well) to these primary landing pages still point to the old versions of the primary landing pages in Google's cache, and thus have not passed the importance and internal link juice to the new versions. There are no navigational links or entry points left to the old supporting pages, and I believe this is what is driving the decline.
Proposed resolution:
I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just as normal pages) with the option selected to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
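The proposed resolution can be sketched as a small script that generates an HTML page linking to the old .asp URLs, so a crawler revisiting the page follows each link and discovers the 301 behind it. The URLs below are placeholders, not the site's actual paths:

```python
# Sketch: build a simple HTML sitemap page linking to the OLD (.asp) URLs.
# A crawler fetching this page follows each link and hits the 301 behind it.
old_urls = [
    "http://www.example.com/products/widgets.asp",
    "http://www.example.com/services/overview.asp",
]

items = "\n".join(f'  <li><a href="{u}">{u}</a></li>' for u in old_urls)
page = (
    "<!DOCTYPE html>\n<html><body>\n<ul>\n"
    + items
    + "\n</ul>\n</body></html>"
)

with open("old-url-sitemap.html", "w") as f:
    f.write(page)
```

With ~100 affected pages, the `old_urls` list would come from the same export used to set up the 301s, split across a few such pages if needed.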
Question 1:
Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2:
Could the proposed solution be harmful?
Question 3:
Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated.
Thank you in advance,
David
-
I did run a search on our old pages in the SERPs and found that a large number of them are still showing. I also found most of our new pages; in some cases both the old and new versions were represented. I have also seen a lot of our positions go from page one to outside the top 100; these are all pages which were 301ed to a nearly exact replica in the new version. I had originally thought Google had hit them but not updated their listings to the new versions. I am now thinking that they are simply being ignored, and have not had their 301s picked up.
-
Hi David,
Can you run a site:yourwebsite.com search in Google and report back? i.e., find some of those pages and check the cache date on them if the URLs are the same. Similarly, if the URL structure has changed, you can check it by combing through all of the Google results for your site's pages in the SERPs. If old URLs that have been 301'd still appear, then they haven't been recrawled and updated.
Once we know whether it's an issue of them not having been recrawled, you'll better understand what the possible issue is and can go from there.
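Beyond SERP checks, one way to answer the recrawl question directly is to scan the server access logs for Googlebot requests to the old URLs and the status codes they received. A rough sketch, assuming logs in the Apache/nginx combined format; the log lines, paths, and function name here are invented for illustration:

```python
import re
from collections import Counter

# Match the request path and status code in a combined-format log line,
# e.g. ... "GET /old-page.asp HTTP/1.1" 301 ...
line_re = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_hits(log_lines):
    """Count Googlebot requests to .asp URLs, keyed by (path, status)."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # only interested in Google's crawler
        m = line_re.search(line)
        if m and m.group("path").endswith(".asp"):
            hits[(m.group("path"), m.group("status"))] += 1
    return hits
```

If an old URL shows up with a 301 against a Googlebot user agent, the redirect has been crawled; old URLs absent from the log haven't been revisited at all.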