Why does my 301 show the old urls with new descriptions and titles?
-
Hi all,
We've just rebranded. The 301 appears to have worked well and moved the results and rankings onto the new domain. However, a site:olddomain.com search in Google brings up about a hundred pages that have the new titles and descriptions but show the old URLs. Does anyone have any idea how to make the old domain disappear from the SERPs?
Many thanks,
Richard
-
Hi Paul,
Bing is much less confused, and we'll sit tight - but many thanks for the change of address tool suggestion - we had missed that!
Best,
Richard
-
Hi Mike,
Ok, that's pretty much what we thought but thanks for confirming your thoughts.
Best,
Richard
-
There's always a time lag while Google sorts out what's going on after a migration, panini. How long ago did the new version go live? Is Bing showing the same "confusion"?
The one additional hint you can give Google, in particular, is to use the "Change of Address" tool in Google Webmaster Tools to advise of the move to the new domain.
Other than that, patience for a couple of weeks is your best friend. If only a hundred out of a larger number of pages still show the "confusion", my guess is this is just the usual dance associated with big G figuring things out after a large change like this.
Good luck & let us know how it plays out!
Paul
-
Hi Richard,
As long as your 301s are set up correctly, Google should eventually drop the old URL from the SERPs.
Make sure that you do not have a temporary (302) redirect set up, which would prolong things.
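One quick way to verify which kind of redirect the old domain is actually serving is to inspect the first response hop yourself (e.g. with curl -I). Here's a minimal standard-library Python sketch of the same check; the domain names are placeholders, not your real sites:

```python
import urllib.error
import urllib.request

PERMANENT = {301, 308}
TEMPORARY = {302, 303, 307}

def classify(status):
    """Label an HTTP status code by redirect type; temporary redirects
    (302/307) can slow Google's consolidation onto the new domain."""
    if status in PERMANENT:
        return "permanent"
    if status in TEMPORARY:
        return "temporary"
    return "not a redirect"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the first hop is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def first_hop(url):
    """Return (status, Location header) for the first response only."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None
    except urllib.error.HTTPError as err:
        # With redirects suppressed, 3xx responses surface here.
        return err.code, err.headers.get("Location")

# e.g. status, location = first_hop("http://olddomain.com/")
#      print(classify(status), location)
```

If this reports "temporary" for the old URLs, fixing the redirect type is the first thing to do.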
If you want to try and speed things up, you could resubmit your sitemap using Google Webmaster Tools.
Hope this helps.
Mike
Related Questions
-
Old pages not mobile friendly - new pages in process but don't want to upset current traffic.
Working with a new client. They have what I would describe as two virtual websites: same domain but different coding, navigation and structure. Old virtual website pages fail mobile friendly; they were not designed to be responsive (there really is no way to fix them), but they are ranking and getting traffic. New virtual website pages pass mobile friendly but are not SEO optimized yet, are not ranking, and are not getting organic traffic.
My understanding is that NOT mobile friendly is a "site" designation and, although the offending pages are listed, it is not a "page" designation. Is this correct?
If my understanding is true, what would be the best way to hold onto the rankings and traffic generated by the old virtual website pages and resolve the "NOT mobile friendly" problem until the new virtual website pages have surpassed the old pages in ranking and traffic?
A proposal was made to redirect any mobile traffic on the old virtual website pages to mobile friendly pages. What will happen to SEO if this is done? The pages would pass mobile friendly because they would go to mobile friendly pages, I assume, but what about link equity? Would they see a drop in traffic? Any thoughts? Thanks, Toni
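On the redirect proposal: the server would need some way to recognize mobile requests. A minimal sketch of user-agent detection; the token list is illustrative, not exhaustive (the "Mobi" substring appears in most mobile browser user-agent strings):

```python
# Illustrative token list; production mobile detection usually relies on a
# maintained library or at least the broad "Mobi" token heuristic.
MOBILE_HINTS = ("Mobi", "Android", "iPhone", "iPad")

def is_mobile(user_agent):
    """True if the request's User-Agent looks like a mobile browser."""
    return any(hint in user_agent for hint in MOBILE_HINTS)
```

If that route is taken, one common pattern is to keep the device-based redirect temporary (302) so the old desktop URLs retain their standing in the index.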
Technical SEO | | Toni70 -
Is it possible to deindex old URLs that contain duplicate content?
Our client is a recruitment agency and their website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, it appears that old job listing pages are still indexed, so our assumption is that Google is holding down the ranking due to the amount of duplicate content present (one software tool returned a score of 43% duplicate content across the website).
Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file. Would blocking the job listing pages from being indexed, either by robots.txt or by a noindex tag, reduce the negative impact of the duplicate content, but also remove any link juice coming to those pages?
In addition, expired job listing URLs stay live, which is likely to be increasing the overall duplicate content. Would it be worth removing these pages and setting up 404s, given that any links to these pages would be lost? If these pages are removed, is it possible to permanently deindex these URLs? Any help is greatly appreciated!
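One caveat when weighing robots.txt against a noindex tag: a URL blocked by robots.txt can't be crawled at all, so Google never sees a noindex tag on it and already-indexed URLs can linger, whereas a noindex tag requires the page to stay crawlable. If robots.txt rules are used, it's worth confirming they actually cover the listing URLs. A minimal sketch with Python's standard robotparser; the /jobs/ path is hypothetical:

```python
from urllib import robotparser

# Hypothetical rule of the kind other recruitment sites use to keep
# individual job listings out of the crawl.
rules = """
User-agent: *
Disallow: /jobs/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A listing URL under /jobs/ is blocked; other pages remain fetchable.
listing_blocked = not rp.can_fetch("*", "https://example.com/jobs/nurse-123")
about_allowed = rp.can_fetch("*", "https://example.com/about")
```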
Technical SEO | | ClickHub-Harry0 -
URL Parameters
On our webshop we've added some URL parameters. We've set URLs with parameters like min_price, filter_cat, filter_color etc. to "don't crawl" in our Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+. Is it better to add these parameters to the robots.txt file? And if that's better, how can we write it down so the URLs will not be crawled? Our robots.txt file now shows:
# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output
User-agent: *
Disallow: /wp-admin/
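On syntax: Google's robots.txt implementation supports the * wildcard, so parameter URLs can be blocked with rules like "Disallow: /*min_price=" (worth checking in Search Console's robots.txt tester before deploying, since over-broad wildcards are easy to write). A small sketch for checking which URLs such rules would cover; the parameter names are taken from the question:

```python
from urllib.parse import parse_qs, urlsplit

# Faceted-navigation parameters from the question that should stay
# out of the crawl.
BLOCKED_PARAMS = {"min_price", "filter_cat", "filter_color"}

def has_blocked_param(url):
    """True if the URL carries any of the blocked filter parameters."""
    params = parse_qs(urlsplit(url).query)
    return not BLOCKED_PARAMS.isdisjoint(params)
```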
Technical SEO | | Happy-SEO1 -
GWT shows 38 external links from 8 domains to this PDF - But it shows no links and no authority in OSE
Hi All, I found one other discussion about the subject of PDFs and passing of PageRank here: http://moz.com/community/q/will-a-pdf-pass-pagerank But this thread didn't answer my question, so am posting it here. This PDF: http://www.ccisolutions.com/jsp/pdf/YAM-EMX_SERIES.PDF is reported by GWT to have 38 links coming from 8 unique domains. I checked the domains and some of them are high-quality relevant sites. Here's the list (domains and number of links):
prodiscjockeyequipment.com 9
decaturilmetalbuildings.com 9
timberlinesteelbuildings.com 6
jaymixer.com 4
panelsteelbuilding.com 4
steelbuildingsguide.net 3
freedocumentsearch.com 2
freedocument.net 1
However, when I plug the URL for this PDF into OSE, it reports no links and a Page Authority of only "1". This is not a new page; this is a really old page. In addition to that, when I check the PageRank of this URL, the PageRank is "nil" - not even "0". I'm currently working on adding links back to our main site from within our PDFs, but I'm not sure how worthwhile this is if the PDFs aren't being allocated any authority from the pages already linking to them. Thoughts? Comments? Suggestions? Thanks all!
Technical SEO | | danatanseo0 -
Showing duplicate content when I have canonical url set, why?
Just inspecting my site's report and I see that I have a lot of duplicate content issues. Not sure why these two pages, http://www.thecheapplace.com/wholesale-products/Are-you-into-casual-sex-patch and http://www.thecheapplace.com/wholesale-products/small-wholesale-patches-1/Are-you-into-casual-sex-patch are showing as duplicate content when both pages have a clearly defined canonical URL of http://www.thecheapplace.com/Are-you-into-casual-sex-patch Any answer would be appreciated, thank you
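When a crawler still flags two URLs as duplicates, the first thing worth ruling out is that the canonical tag is genuinely present, and identical, in the served HTML of both variants. A minimal stdlib sketch that extracts the canonical URL from a page's markup; the feed string here is just an illustration:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" '
            'href="http://www.thecheapplace.com/Are-you-into-casual-sex-patch"></head>')
```

Note that some crawl tools report duplicate pairs even when the canonicals are correct, so if the tag is present and matching on both URLs, the report may simply be safe to disregard.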
Technical SEO | | erhansimavi0 -
I'm redesigning a website which will have a new URL format. What's the best way to redirect all the old URLs to the new ones? Is there an automated, fast way to do this?
For example, the new URL will be: https://oregonoptimalhealth.com/about_us.html while the old ones were like this: http://www.oregonoptimalhealth.com/home/ooh/smartlist_1/services.html I have to redirect almost 100 old pages to the correct new pages. What's the best and easiest way to do this?
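If the old URLs follow a consistent pattern, those ~100 redirects can usually be collapsed into one or two pattern rules (e.g. Apache's RedirectMatch 301 in .htaccess) rather than a hand-written list. Here's a sketch of the mapping, assuming from the two example URLs that old pages live under /home/ooh/smartlist_<n>/ and the new pages sit at the site root; the pattern is hypothetical and would need checking against the full URL list:

```python
import re

# Inferred from the question's example URLs; verify against the real list.
OLD_FORMAT = re.compile(r"^/home/ooh/smartlist_\d+/(?P<page>[^/]+\.html)$")

def new_location(old_path):
    """Map an old-format path to its new root-level path, or None."""
    match = OLD_FORMAT.match(old_path)
    return "/" + match.group("page") if match else None
```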
Technical SEO | | PolarisMarketing0 -
Does it really matter to maintain 301 redirect after de-indexing of old URLs?
Today, I was reading the latest blog post on the SEOmoz blog: Uncrawled 301s - A Quick Fix for When Relaunches Go Too Well. This is a very interesting study about 301s and how useful they are for maintaining traffic. I'm working on an eCommerce website and I have done similar stuff on my website. I have big confusion about managing 301 redirects. My website generates new URLs due to the following actions:
Re-write dynamic URLs.
Re-launch entire website on a different eCommerce platform. [osCommerce to Magento Commerce]
Re-name a category.
Transfer one product from one category to another category.
I'm managing my 301 redirects with the old practice: Excel sheet data from Google Webmaster Tools, with specific new URLs set for each redirect. Now I have 8.5K redirects in htaccess... and I'm thinking it's too much. Can we remove old 301 redirects from htaccess or not? This is a big question for me, because not all pages are hyperlinked on external websites, and Google has just de-indexed the old URLs and indexed the new URLs. So, is it required to maintain the 301 redirects after Google has processed them?
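Whether the 8.5K rules can be trimmed is partly an empirical question: a redirect that nothing requests any more (no external links, no bookmarks, Google already on the new URL) is doing no work. A minimal sketch for finding which old paths still receive requests, assuming a standard common/combined-format access log; the function and path names are illustrative:

```python
import re

# Matches the request line in common/combined log format, e.g.
# 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /old-page.html HTTP/1.1" 301 -
REQUEST = re.compile(r'"(?:GET|HEAD|POST) (?P<path>\S+) HTTP')

def still_requested(log_lines, old_paths):
    """Return the old redirect sources that still show up in the access log.

    Redirects for these paths are still doing work; the rest are
    candidates for pruning."""
    hits = set()
    for line in log_lines:
        match = REQUEST.search(line)
        if match and match.group("path") in old_paths:
            hits.add(match.group("path"))
    return hits
```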
Technical SEO | | CommercePundit0 -
Why do I have one page showing as two URLs?
My SEOMoz stats show that I have duplicate titles for the following two URLs: http://www.rmtracking.com/products.php and http://www.rmtracking.com/products I have checked my server files, and I don't see a live page without the php. A while back, we converted our site from html to php, but the html pages have 301s and, as you can see, the page without the php is properly redirecting to the php page. Any ideas why this would show as two separate URLs?
Technical SEO | | BradBorst0