Development site is live (and has indexed) alongside live site - what's the best course of action?
-
Hello Mozzers,
I am undertaking a site audit and have just noticed that the developer has left the development site up, and it has been indexed. They 301'd from pages on the old site to equivalent pages on the new site, but they seem to have allowed the development site to be indexed, and they haven't switched it off. So would the best option be to redirect the development site pages to the homepage of the new site (there is no PR on the dev site and there are no incoming links to it, so nothing much to lose...)? Or should I request equivalent-to-equivalent page redirection?
Alternatively I can simply ask for the dev site to be switched off and the URLs removed via WMT, I guess...
Thanks in advance for your help!
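By equivalent-to-equivalent redirection I mean something along these lines - a rough Apache sketch with hypothetical hostnames, assuming the dev site allows .htaccess overrides:

```apache
# Hypothetical .htaccess on the development host, 301-redirecting every
# dev URL to the matching path on the live site:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```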
-
Very pleased to have been of assistance
Here are links to older threads where I asked similar questions before, for further verification and credit to those who originally helped me:
-
Thanks Amelia - yes, you're definitely on the right lines - Dan's response below is very helpful too, that's for sure. I do struggle with developers from time to time, so I'm teaching myself coding and so on via Codecademy, etc. - I learnt at uni many years ago but am very out of date! It will come in useful for SEO too.
-
Many thanks Dan - much appreciated - that process makes perfect sense in my case too :) I will report back on progress in a month or so...
-
Yes a great answer there from Dan - and thanks for your useful input - good point re: not relying on robots.txt alone!
-
Thanks Robert, and for the extra comments too!
I can't remember which Mozzer helped me with the above in the first place and who should be credited, but I'll track down the original thread and add it to this post, since it also contains further info and discussion.
All Best
Dan
-
Dan,
This is a very good answer. Just to emphasize, probably the most important piece with a "dev" site is the last one Dan mentions: password protection. Once you clean up the issue, add it, and you should not have the issue going forward.
Even with robots.txt on our dev sites and our design studio, we have had pages end up in the SERPs. Because of the DA of our design studio (where clients go to approve a comp, etc.), we recently had a new political client's comp ranking for a search term on page one - ahead of their actual site (we were building another to replace it). So, even with robots.txt, there is still no guarantee a page will not be crawled.
Adding password protection will assist with that. Lastly, if you have someone building you a site and they say they do not want to take down the dev version after your launch, tell them you do not wish to pay them. It will go down. That is unreasonable. I cannot think of a reason to keep the dev version live once the client site launches.
Again, good job Dan.
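As a side note: while, as above, valid blocking rules are still no guarantee that URLs stay out of the index, you can at least sanity-check that a dev host's robots.txt syntax really does block crawlers using Python's built-in parser (hostname hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A dev-site robots.txt that disallows everything for every crawler
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# No crawler that honours robots.txt may fetch any path on the dev host
allowed = parser.can_fetch("Googlebot", "https://dev.example.com/some-page")
print(allowed)  # False
```

The same check can be run against the live file with `parser.set_url(...)` plus `parser.read()` if you want to verify what is actually being served.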
-
Hi
I'm in a similar-ish situation with a client's site.
Their situation is that the dev site is on a subdomain, i.e. staging.domain.com, and they want to keep the staging area active for demonstrating future development work, so the situation may be slightly different from yours.
They have now blocked it via robots.txt, but that's like shutting the stable door after the horse has already bolted.
I asked Moz Q&A a few months ago and got the below answer from a few very helpful and wise Mozzers:
-
1. Set up a completely different Webmaster Tools account, unrelated to the main site, so that there is a new WMT account specific to the staging-area subdomain.
2. Add a robots.txt on the staging subdomain that disallows all pages for all crawlers, OR use the noindex meta tag on all pages (though Google much prefers robots.txt usage for this).
3. Note: it's very important that when you update the main site you do not include or push out these files and instructions too, since that would result in the main site being de-indexed.
4. Request removal of all pages in GWT. Leave the form field for the page to be removed blank, since that will remove all subdomain pages.
5. After about one month, OR once you see that the pages are all out of the search engine listings (SERPs) and Google has spidered and seen the robots.txt, put a password on the entire staging site.
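For reference, a minimal sketch of the robots.txt described above - served at the root of the staging subdomain only, with hypothetical hostnames:

```
# Hypothetical robots.txt at staging.domain.com/robots.txt only -
# take care this file is never deployed to the main site.
User-agent: *
Disallow: /
```

The meta-tag alternative is `<meta name="robots" content="noindex">` in the head of every staging page.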
Hope that helps
All Best
Dan
-
Hi Luke,
I'm interested in other responses to this question...
If I was in your position, after seriously berating the dev, I would make sure you disallow the dev site in your robots.txt and use Webmaster Tools to remove the URLs from the index. Then I would password protect the dev site so the search engines couldn't get there even if they tried.
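For reference, a minimal sketch of Apache basic-auth password protection for a dev site (paths hypothetical - adjust to your own server layout):

```apache
# Hypothetical .htaccess at the dev site's document root.
# Create the credentials file first, outside the web root:
#   htpasswd -c /home/example/.htpasswd devuser
AuthType Basic
AuthName "Development site - authorized access only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```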
Like I say, I'm interested in other responses! This is what I would do, but I don't really know if it's definitely the right thing to do. Does anyone else have anything to add?
Best of luck - it's crappy when someone else's error cocks up your work: when our site launched for the first time, our IT department screwed up on a monumental scale by getting the DNS settings wrong.
Amelia
Related Questions
-
How long will old pages stay in Google's cache index? We have a new site that is two months old, but we are seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache. So how long should I tell the client it should take for them all to be removed from search?
Intermediate & Advanced SEO | Liamis
-
Canonical URLs For Two Domains
We have two websites: one we use for Google PPC (website 1) and one (website 2) we use for everything else. The reason is we are in an industry that Google AdWords doesn't like, so we built a whole other website that removes the product descriptions, as Google AdWords doesn't approve of many of them (nutrition). Right now that AdWords-approved website (website 1) is noindex/nofollow because we didn't want to run into potential duplicate content issues in free search, but the issue is we can't submit it to Google Shopping, as they require it to be indexable. Do you think removing the noindex/nofollow from website 1 and adding canonical URLs pointing to website 2 would resolve this issue (being able to submit it to Google Shopping) and not cause any problems with duplicate content? I was thinking of adding the canonical tag to all pages of website 1 and pointing it to website 2. Does that make sense? Do you think that would work?
Intermediate & Advanced SEO | vetofunk
-
Sitemap Indexed Pages, Google Glitch or Problem With Site?
Hello, I have a quick question about our Sitemap Web Pages Indexed status in Google Search Console. Because of the drastic drop, I can't tell if this is a glitch or a serious issue. When you look at the attached image you can see that under Sitemaps, Web Pages Indexed dropped suddenly on 3/12/17 from 6,029 to 540. Our Index status shows 7K+ indexed. Other than product updates/additions and homepage layout updates, there have been no significant changes to this website. If it helps, we are operating on the Volusion platform. Thanks for your help! -Ryan
Intermediate & Advanced SEO | rrhansen
-
What's your Link Building Tactic?
So my question is: what's your link building tactic? I always have a bit of a problem building links for my websites. Also, do you use some kind of tool? If yes, can you recommend it?
Intermediate & Advanced SEO | Angelos_Savvaidis
-
Is it possible to lose rank because my site's IP changed?
I manage a site on the 3dCart e-commerce platform. I recently updated the SSL certificate. Today, when I tried to log in via FTP, I couldn't connect. The reason I couldn't connect was that my IP had changed. Last week the site experienced almost across-the-board ranking drops on almost every important keyword. Not gigantic drops - a lot just lost 2-4 positions - but that's a lot when you were #2 and you drop to #4 or #6. Initially I thought it was because I was attempting to mark up my product pages using structured data following guidelines from schema.org. I am not a coder, so it was a real struggle, especially trying to navigate 3dCart's listing templates. I thought the ranking drops were Google slapping me for bad code, but now I wonder... could I really have dropped because of that IP address change? Does anyone have a take on this? Thanks!
Intermediate & Advanced SEO | danatanseo
-
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (10s/100s of millions of pages): what's a typical (or highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out: 1) the average index % out there; 2) whether there is a ceiling (i.e. it will never reach 100%); 3) whether it's possible to improve the indexing percentage further. Just to give you some background, sitemap index files (per the sitemaps.org protocol) have been implemented to improve crawl efficiency, and I'm wanting to find other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure whether this is the best path to take, or if I'm just flogging a dead horse - if there is such a ceiling, or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
-
Google Indexed the HTTPS version of an e-commerce site
Hi, I am working with a new e-commerce site. The way they are set up is that once you add an item to the cart, you're put onto secure HTTPS versions of the pages as you continue to browse. Well, somehow this translated to Google indexing the whole site as HTTPS, even the home page. A couple of questions: 1. I assume that is bad, or could hurt rankings, or at a minimum is not best practice for SEO, right? 2. Assuming it is something we don't want, how would we go about getting the http versions of the pages indexed instead of https? Do we need rel=canonical on each page pointing to the http version? Anything else that would help? Thanks!
Intermediate & Advanced SEO | brianspatterson
-
What's your best hidden SEO secret?
Don't take that question too seriously, but all answers are welcome 😉 Answer to all:
"Gentlemen, I see you did your best - at least I hope so! But after all, I suppose I am stuck here to go on reading the SEOmoz blog if I can't squeeze more secrets from you!"
Intermediate & Advanced SEO | petrakraft