301 Redirects on Large Real Estate Website
-
Hi guys,

We are about to move over to a new website and need advice on handling the 301 redirects. We have a large real estate website with around 12,000 pages; a lot of these are properties (about 10,000).

On our old website, the URL structure for each property is as follows:

domainname.com/property/view?property=14863

On our new site, the URL structure is:

domainname.com/properties/view/6137

The property ID number is always different from the old site to the new. The way we see it, we have two options:

a) A manual redirect of each and every property URL. A very, very long job.

b) A folder-level redirect, i.e. redirect the 'property' folder on the old site into the 'properties' folder on the new one. The con with this one is we are not sure if it is the best route to take, and if it is, how we would go about it.

Some advice would be really appreciated, guys. I know there are some hyper-intelligent SEOs in here and we need to make sure we handle this right!

Many thanks in advance,
Mark
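If it helps, I think option (b) would look something like the following in the old site's .htaccess (a hypothetical sketch; as far as I can tell it can only carry the old ID across, not translate it into the new one):

```apache
# Hypothetical .htaccess rule on the old site (option b).
# It forwards every old /property/view?property=ID URL into the new
# /properties/ folder, but it can only reuse the OLD ID; it has no way
# to translate that into the new site's different ID.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^property=([0-9]+)$
RewriteRule ^property/view$ /properties/view/%1? [R=301,L]
```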
-
This is true: you can wait for Google to de-index them, but that can take six months or more.
You could also wait for the 404s to show up, check the referrer, and then manually set up the redirect. If you miss seeing them, though, you also risk the linking site removing the link.
Another thing you could do is pull reports from Google Webmaster Tools, Bing Webmaster Tools, and Majestic to discover who is linking to which pages, start with those redirects, then watch for further 404s and pick them up as you discover them.
If you do want to push Google along with removing the old pages, you can request removal in Webmaster Tools. 12,000 isn't really that many; last time I tried, you could request 1,000 per day, but you have to do them one at a time. That means either a slow manual process or a macro. I think I've had 20,000 or more removed that way.
-
Hi Mark,
Considering that the old property IDs and new property IDs don't match up and you'd have to configure 1-to-1 redirects (with what sounds like a lot of manual work to get it right and potentially a very large .htaccess file), I'm going to ask a dumb question: why do you need to redirect all of the properties?
In cases like this, I invariably pull some data in to prioritize URLs. Namely, inbound link and direct/referral traffic data.
If a page is not linked to from any external subdomains and gets little or no direct or referral traffic, it's usually best to simply let it return a 404 once you've updated the site - Google will hit the 404 and de-index the page in due time, while the new page will (provided the new site has sound architecture and some authority to justify a deep crawl budget) get picked up.
The only justifiable reason to do a 1-to-1 301 redirect across the board for this many URLs, in my opinion, is if there is enough link equity / traffic to justify the work. Otherwise, Google knows how to handle 404s and they'll crawl/index the new property URLs in due time.
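To illustrate the prioritization step (a minimal sketch with hypothetical filenames and example URLs; in practice the lists come from your crawler and link-tool exports):

```shell
# Hypothetical example: find which old property URLs actually have
# external links and therefore justify a 1-to-1 redirect.
# Each file holds one URL per line (adjust to your tool's export).
printf '%s\n' \
  'http://example.com/property/view?property=14863' \
  'http://example.com/property/view?property=14901' \
  'http://example.com/property/view?property=15002' > all-property-urls.txt

printf '%s\n' \
  'http://example.com/property/view?property=14863' > linked-urls.txt

# comm needs sorted input; lines common to both files (-12) are the
# URLs with external links, i.e. the ones worth redirecting 1-to-1.
sort all-property-urls.txt > all.sorted
sort linked-urls.txt > linked.sorted
comm -12 all.sorted linked.sorted > priority-redirects.txt

cat priority-redirects.txt
```

Everything not on that priority list can simply be left to return a 404 as described above.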
Best,
Mike
-
Hey Alan,
Thanks loads for the advice there. Makes a lot of sense.
The problem I have is that we do not have any kind of access to the old site, nor does the client have a good relationship with the agency who made the previous site.
I have run multiple crawls of the old site with Screaming Frog and Moz and I just can't get all the properties spidered. Of the total number of properties, I have only about a third, which of course can be redirected.
We made a final change to the URL structure so the property address is added. The URLs now look like the following:
OLD - domainname.com/property/view?property=14863
NEW - domainname.com/property/street-name-postcode/propertyid
The main problem we have, and why I think it is not possible using mod_rewrite, is that the property IDs are different on the two sites. There is really nothing in common between the two URLs at all aside from /property/ and the page title.
Any further advice would be very much appreciated, Alan, as it's clear you have done jobs like this before.
Thanks,
Mark
-
If you have Unix and shell access, it should be a snap. But as you're asking this question, you may not know what "grep" is, so here are the steps:
Get a list of titles and URLs from each site.
Mix them together.
Sort by title. This will tell you if there are duplicates or if you missed any.
If the domain names are different, search and replace so they match.
Manipulate the list so it is in redirect format.
12,000 is not a lot. I've worked on sites with several million.
Don't do a folder level redirect.
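The title-matching steps above could be sketched like this (hypothetical filenames and example rows; the real exports would come from a crawler such as Screaming Frog):

```shell
# Build a redirect map by joining the two crawls on page title.
# Each .tsv file holds "Page Title<TAB>URL", one property per line.
printf '%s\t%s\n' '3 Bed House, Elm Street' 'http://example.com/property/view?property=14863'  > old.tsv
printf '%s\t%s\n' '2 Bed Flat, Oak Road'    'http://example.com/property/view?property=14901' >> old.tsv

printf '%s\t%s\n' '3 Bed House, Elm Street' 'http://example.com/properties/view/6137'  > new.tsv
printf '%s\t%s\n' '2 Bed Flat, Oak Road'    'http://example.com/properties/view/6138' >> new.tsv

# First pass indexes the new URLs by title; second pass looks up each
# old title and prints "old-path<TAB>new-url".
awk -F'\t' '
  NR == FNR { new[$1] = $2; next }
  $1 in new {
    path = $2
    sub(/^https?:\/\/[^\/]+/, "", path)   # keep only the old URL path
    print path "\t" new[$1]
  }
' new.tsv old.tsv > redirect-map.txt

cat redirect-map.txt
```

From there the map can be templated into whatever redirect format your server needs; one hypothetical option on Apache is feeding it to a txt-type RewriteMap, one pair per line.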