What's the best way to revive a directory that was 301'd, now that I want to remove the redirect?
-
Last year I 301'd one of the directories on my site, pointing everything to a different directory. Long story short, I am going to sell this product line again and would like to remove the 301 so the original directory works again, but I am reading that 301s are also cached in most browsers for a long time. Has anyone successfully done this, and if so, what did you have to do?
Thanks
Mike
-
I agree with @Jimmy. Remove the 301s, make sure internal (and, where possible, external) links point to the reinstated URLs, and update the XML sitemap. As for browsers caching the 301s, here's what I found:
- IE7, IE8, Android 2.3.4 do not cache at all.
- Firefox 18.0.2, Safari 5.1.7 (on Windows 7), and Opera 12.14 all cache, and clear the cache on browser restart.
- IE10 and Chrome 25 cache, but do not clear on browser restart.
There was a write-up about overriding cached 301s by re-pointing them through a 302, but I recommend using the above method instead... http://goo.gl/Z1uEr
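For reference, if the site runs on Apache, the removal itself is just deleting the old rule; a minimal sketch, assuming the redirect lives in .htaccess and with hypothetical directory names:

```apache
# The old rule that browsers and Google cached as permanent --
# delete it or comment it out to revive the original directory:
# Redirect 301 /widgets/ /gadgets/

# If anything still needs forwarding during the transition, use a
# 302 so clients treat it as temporary and keep re-checking:
Redirect 302 /widgets/discontinued-item.html /gadgets/replacement-item.html
```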
-
Remove the 301, then get links pointing to the revived site. Add a sitemap and make sure there is no duplicate content between the two directories. If the 301 was listed in Webmaster Tools, remove it there, then verify the site and set your preferred version.
Also generate an XML sitemap and submit it to Google, Bing, Yahoo, etc.
That should do it... It won't happen overnight...
Related Questions
-
When rebranding, what's the best thing to do with the new domain before rebranding?
A. Do nothing
B. Redirect to legacy site (current domain)
C. Create a placeholder with information about the rebranding
D. Other... What do you think is best?
Intermediate & Advanced SEO | Maxaro.nl
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove them from the index? We don't want these pages to be found.
Intermediate & Advanced SEO | andyheath
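Worth noting on this one: Disallow only stops crawling, it doesn't remove pages already in the index (they can linger as URL-only results). To actually deindex them, Google needs to be able to fetch the URLs and see a noindex signal first. A hedged sketch of the header-based approach, assuming Apache server config with mod_headers and the path the developer mentioned:

```apache
# Let Googlebot crawl the gallery URLs (no robots.txt Disallow yet) but
# tell it not to index them; once they drop out of the index, the
# Disallow rule can be added to save crawl budget.
<IfModule mod_headers.c>
    <LocationMatch "^/catalog/product/gallery/">
        Header set X-Robots-Tag "noindex"
    </LocationMatch>
</IfModule>
```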
What would be the best way to transition from a mobile website to responsive?
We have a mobile website (mobile.website.com) that mirrors our desktop site (www.website.com) with 100,000+ pages. We have an alternate tag on our desktop site pointing to the mobile site, and user-agent detection that redirects mobile traffic to the mobile site. The mobile site is noindexed and has a canonical to the desktop site. Everything works pretty well: the mobile site is not indexed and only shows up in the SERPs when a user searches from a mobile device. Our main website is now responsive and we would like to kill our mobile site without compromising our traffic. We know that a slight speed or content change can affect traffic, so what would be the best way to do this?
- Big bang: redirect all mobile URLs to desktop, remove the user-agent detection, and remove the alternate tag on desktop
- Semi big bang: remove the user-agent detection and the alternate tag on desktop, and see how traffic reacts before redirecting
- Progressive: remove the user-agent detection and the alternate tag on some sections of the website to see how traffic reacts
- Other?
Does anyone have any experience with this? Thanks, and let me know if anything is not clear.
Intermediate & Advanced SEO | Digitics
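If the big-bang option is chosen, the redirect itself is straightforward; a minimal Apache sketch, using the hostnames from the question but assuming everything else (mod_rewrite, HTTPS, same URL paths on both hosts):

```apache
# Send every mobile.website.com URL to the same path on the responsive
# site with a 301 so link equity consolidates. Remember to remove the
# user-agent detection and the rel=alternate tags at the same time.
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^mobile\.website\.com$ [NC]
    RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
</IfModule>
```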
Manual Removal Requests Versus Automated Requests to Remove Bad Links
Our site has several hundred toxic links. We would prefer that the webmasters remove them rather than submitting a disavow file to Google. Are we better off writing webmasters over and over again to get the links removed? If someone is monitoring the removal and keeps writing the webmasters, will this ultimately get better results than using an automated program like LinkDetox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask? I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome. Does anyone have experience with this? Basically, how do you get the highest compliance rate for link removal requests? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
Best Way to Incorporate FAQs into Every Page - Duplicate Content?
Hi Mozzers, We want to incorporate a 'Dictionary' of terms onto quite a few pages on our site, similar to an FAQ system. The 'Dictionary' has 285 terms in it, with about one sentence of content for each (approximately 5,000 words total). The content is unique to our site and not keyword-stuffed, but I am unsure what Google will think about us having all this shared content on these pages. I have a few ideas about how we can build this, but my higher-ups really want the entire dictionary on every page. Thoughts? Image of what we're thinking here - http://screencast.com/t/GkhOktwC4I Thanks!
Intermediate & Advanced SEO | Travis-W
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of PageRank because we thought a JavaScript redirect was the only way! But other sites have been able to do this: examples and details inside.
Hi Moz, Here's more info on our problem, and thanks for reading! We're trying to create 301 redirects for 44 pages on site.com. We're having trouble 301 redirecting these pages, possibly because they are AJAX and have hashbangs in the URLs. These are locations pages. The old locations URLs are in the following format: www.site.com/locations/#!new-york and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york We have not been able to create these redirects using the Yoast WordPress SEO plugin v1.5.3.2. The CMS is WordPress version 3.9.1. The reason we want to 301 redirect these pages is because we have created new pages to replace them, and we want to pass PageRank from the old pages to the new. A 301 redirect is the ideal way to pass PageRank. Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
Intermediate & Advanced SEO | DA2013
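Worth noting why the plugin can't do this: everything after the `#` never reaches the server, so WordPress and Apache literally cannot see `#!new-york`, and no true server-side 301 is possible for regular visitors. Sites that appear to "301 redirect" hashbang URLs typically do it client-side. A minimal sketch of that client-side piece, assuming the old /locations/ page is still served and the path mapping is as described in the question:

```javascript
// Map an old hashbang fragment ("#!new-york") to the new path
// ("/locations/new-york"); returns null if the hash isn't a hashbang.
// Kept as a pure function so the mapping logic is easy to test.
function pathFromHashbang(hash) {
  const m = /^#!(.+)$/.exec(hash);
  return m ? "/locations/" + m[1] : null;
}

// On the old /locations/ page, forward visitors to the new URL.
// location.replace avoids leaving the old URL in browser history.
if (typeof window !== "undefined") {
  const target = pathFromHashbang(window.location.hash);
  if (target) window.location.replace(target);
}
```

A JavaScript redirect consolidates signals more slowly than a real 301, but Google does follow it, and the replacement pages at /locations/new-york can then be redirected normally if they ever move again.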
Google Semantic Search: Now I'm really confused
I'm struggling to understand why I rank for some terms and not for other closely related ones. For example: property in Toytown but NOT properties in toytown property for sale in Toytown but NOT property for sale Toytown NOR properties for sale Toytown. My gut instinct is that I don't have enough of the second phrasing as inbound link anchor text -- but didn't Penguin/Panda make all that obsolete?
Intermediate & Advanced SEO | Jeepster
Is there a way to find out how many 301 redirects a site gets?
If you do a search for "personal loans" on Google, the first non-local/personal result is onemainfinancial.com. They have far fewer links showing in OSE and YSE than the other sites. I know onemainfinancial.com is a Citibank site, so I'm trying to determine if they are ranking so high because they are getting 301 link juice from old Citibank.com authority pages. Is there any way to check which sites are sending link juice through a 301 redirect instead of a direct link?
Intermediate & Advanced SEO | fthead9