Development site is live (and has indexed) alongside live site - what's the best course of action?
-
Hello Mozzers,
I am undertaking a site audit and have just noticed that the developer has left the development site up, and it has been indexed. They 301ed pages on the old site to the equivalent pages on the new site, but they seem to have allowed the development site to be indexed, and they haven't switched it off. Would the best option be to redirect the development site's pages to the homepage of the new site (there is no PR on the dev site and there are no incoming links to it, so nothing much to lose...)? Or should I request equivalent-to-equivalent page redirection?
Alternatively I can simply ask for the dev site to be switched off and the URLs removed via WMT, I guess...
Thanks in advance for your help!
-
Very pleased to have been of assistance
Here are links to older threads where I asked similar questions before, for further verification and to credit those who originally helped me:
-
Thanks Amelia - yes, you're definitely on the right lines - Dan's response below is very helpful too, that's for sure. I do struggle with developers from time to time, so I'm teaching myself coding via Codecademy, etc. - I learnt at uni many years ago, but I'm very out of date! It will come in useful for SEO too.
-
Many thanks Dan - much appreciated - that process makes perfect sense and applies in my case too :) I will report back on progress in a month or so...
-
Yes a great answer there from Dan - and thanks for your useful input - good point re: not relying on robots.txt alone!
-
Thanks Robert, and for the extra comments too!
I can't remember which Mozzer originally helped me with the above and should be credited, but I'll track down the original thread and add it to this post, since it also contains further info and discussion.
All Best
Dan
-
Dan,
This is a very good answer. Just to emphasize, probably the most important piece with a "dev" site is the last one Dan mentions: password protection. Once you clean up the issue, add password protection and you should not have the issue going forward.
Even with robots.txt on our dev sites and our design studio, we have had pages end up in the SERPs. Because of the DA of our design studio (where clients go to approve a comp, etc.), we recently had a new political client's comp ranking for a search term on page one - ahead of their actual site (we were building another to replace it). So, even with robots.txt, there is still no guarantee a page will not be crawled.
Adding password protection will assist with that. Lastly, if you have someone building you a site and they say they do not want to take down the dev version after your launch, tell them you do not wish to pay them. It will go down. That demand is unreasonable; I cannot think of a reason to keep the dev version live once the client site launches.
Again, good job Dan.
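For anyone wondering what that looks like in practice: on an Apache server, password protection is typically HTTP Basic Auth via an .htaccess file. Here is a minimal sketch - the file paths, username, and realm text are placeholders, and it assumes Apache with mod_auth_basic. The password file itself would be created with Apache's htpasswd tool, e.g. `htpasswd -cb /home/user/.htpasswd devuser 'a-strong-password'`:

```shell
# Write the .htaccess for the dev site's document root
# (sketch - paths and names are placeholders).
cat > .htaccess <<'EOF'
AuthType Basic
AuthName "Restricted dev site"
AuthUserFile /home/user/.htpasswd
Require valid-user
EOF

# Sanity check the file we just wrote:
grep -c 'Require valid-user' .htaccess   # prints 1
```

With this in place crawlers receive a 401, so nothing on the dev host can be fetched or indexed regardless of what robots.txt says.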
-
Hi
I'm in a similar-ish situation with a client's site.
Their dev site is on a subdomain (i.e. staging.domain.com), and they want to keep the staging area active for demonstrating future development work, so the situation may be slightly different from yours.
They have now blocked it via robots.txt, but that's like shutting the stable door after the horse has bolted.
I asked Moz Q&A a few months ago and got the answer below from a few very helpful and wise Mozzers.
-
1. Set up a completely separate Webmaster Tools account, unrelated to the main site, so that there is a new WMT account specific to the staging sub-domain.
2. Add a robots.txt on the staging sub-domain that disallows all pages to all crawlers, OR use the noindex meta tag on all pages (though Google much prefers robots.txt for this). Note: it's very important that updates to the main site do not include or push out these files and instructions too, since that would result in the main site being de-indexed.
3. Request removal of all pages in GWT. Leave the form field for the page to be removed blank, since that will remove all sub-domain pages.
4. After about one month, OR once you see that the pages are all out of the search engine listings (SERPs) and Google has spidered and seen the robots.txt, put a password on the entire staging site.
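For reference, the robots.txt in step 2 is tiny. A sketch of the disallow-all file that would sit at the root of the staging sub-domain (staging.domain.com is of course a placeholder), written here as a shell heredoc:

```shell
# Disallow-all robots.txt for the staging sub-domain ONLY -
# as the note in step 2 says, never let a deploy push this file to the main site.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

cat robots.txt
```

Worth remembering (as others point out in this thread) that robots.txt only blocks crawling - it does not by itself remove already-indexed pages, which is why the GWT removal request and the eventual password are the steps that actually finish the job.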
Hope that helps
All Best
Dan
-
Hi Luke,
I'm interested in other responses to this question...
If I were in your position, after seriously berating the dev, I would make sure you disallow the dev site in its robots.txt and use Webmaster Tools to remove the URLs from the index. Then I would password protect the dev site so the search engines couldn't get there even if they tried.
Like I say, I'm interested in other responses! This is what I would do, but I don't really know if it's definitely the right thing to do. Does anyone else have anything to add?
Best of luck - it's crappy when someone else's error cocks up your work: when our site launched for the first time, our IT department screwed up on a monumental scale by getting the DNS settings wrong.
Amelia
Related Questions
-
What is the best SEO way to categorize products on an ecommerce site
What is the best way, for SEO, to set up categories for an ecommerce site selling beauty products? I have currently built my product categories so that if a person looks under the hydration category they find our body lotion, but if they look under the body section of products they will also find the same body lotion. Is this a problem for SEO? I think it helps the customer find the product.
Intermediate & Advanced SEO | Kuhliff0
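Having one product reachable under two category URLs is fine for users, but it does create two URLs with the same content. The usual fix is a rel=canonical on the duplicate pointing at whichever URL you prefer to rank. A hedged sketch with invented URLs, written as a shell heredoc just to show the tag:

```shell
# The <head> of the duplicate URL (/body/body-lotion) would declare the
# preferred one (/hydration/body-lotion) as canonical - URLs invented here:
cat > canonical-snippet.html <<'EOF'
<link rel="canonical" href="http://www.example.com/hydration/body-lotion" />
EOF

cat canonical-snippet.html
```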
Huge organic drop following new site go live
Hi Guys, I am currently working on a site whose organic traffic suffered (and is still suffering) a huge drop - from a consistent 300-400 organic visits a day to almost zero. This happened as soon as the new site went live, and I am now digging to find out why. 301s were put in place (over 2,500 of them), and after reviewing Search Console this morning there are still over 1,100 outstanding. Having looked at the redirect file that was put in place when the new site went live, it all looks OK, apart from the fact that the redirects look like this... http://www.physiotherapystore.com/ to http://physiotherapystore.com/ - where the new URL is missing www. I am concerned this is causing a large duplicate issue, as both www. and non-www. work fine. Am I right to be concerned, or is this something not to worry about?
Intermediate & Advanced SEO | HappyJackJr0
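On the www/non-www point: if both hosts serve the same pages, the usual fix is a site-wide 301 to a single canonical host. Since the existing redirects target the bare domain, a sketch of an Apache .htaccess forcing non-www might look like this (assumes mod_rewrite; the rule is a common idiom, not taken from the actual site's config):

```shell
cat > .htaccess <<'EOF'
RewriteEngine On
# 301 every www request to the bare domain, preserving the path.
RewriteCond %{HTTP_HOST} ^www\.physiotherapystore\.com$ [NC]
RewriteRule ^(.*)$ http://physiotherapystore.com/$1 [R=301,L]
EOF

grep -c 'R=301' .htaccess   # prints 1
```

Running `curl -I http://www.physiotherapystore.com/any-page` should then show a single 301 hop whose Location header is the non-www URL.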
Site's pages has GA codes based on Tag Manager but in Screaming Frog, it is not recognized
Using Tag Assistant (a Google Chrome add-on), we have found that the site's pages have GA code (see screenshot 1). However, when we used Screaming Frog's filter feature - Configuration > Custom > Search > Contains/Does Not Contain (see screenshot 2) - SF displays several URLs (maybe all) of the site under 'Does Not Contain', which means that in SF's crawl the site's pages have no GA code (see screenshot 3). What could be the problem? Why does SF state that there is no GA code on the site's pages when, according to Tag Assistant/Tag Manager, the code is there? Please give us steps/ways to fix this issue. Thanks!
Intermediate & Advanced SEO | jayoliverwright0
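A likely cause: unless JavaScript rendering is enabled, Screaming Frog searches the raw HTML source, and when GA is deployed through Google Tag Manager the analytics code is injected at runtime - so the raw source contains only the GTM container snippet, not the GA code that Tag Assistant sees in the rendered page. A small sketch of the mismatch (the HTML and container ID are invented):

```shell
# Simulated raw HTML of a page that loads GA via Google Tag Manager:
cat > page.html <<'EOF'
<head>
<script src="https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXX"></script>
</head>
EOF

grep -c 'googletagmanager' page.html   # the GTM container IS in the raw source: prints 1
grep -q 'analytics.js' page.html || echo 'GA library not in raw HTML'
```

If that's what is happening, either point SF's custom search at the GTM container snippet instead of the GA code, or (in recent SF versions) enable JavaScript rendering under Configuration > Spider > Rendering so the injected code becomes visible to the crawl.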
What's the best internal linking strategy for articles and on-site resources?
We recently added an education center to our site with articles and information about our products and industry. What is the best way to link to and from that content? There are two options I'm considering: Link to articles from category and subcategory pages under a section called "related articles" and link back to these category and subcategory pages from the articles: category page <<--------->> education center article education center article <<---------->> subcategory page Only link from the articles to the category and subcategory pages: education center article ---------->> category page education center article ---------->> subcategory page Would #1 dilute the SEO value of the category and subcategory pages? I want to offer shoppers links to more information if they need it, but this may also take them away from the products. Has anyone tested this? Thanks!
Intermediate & Advanced SEO | | pbhatt0 -
Hundreds of thousands of 404's on expired listings - issue.
Hey guys, We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions (note that Webmaster Tools caps out at 100,000). Many of these listings receive links. Listings less than 45 days old show other possible products to buy, based on an algorithm. It is not possible for Google to crawl expired listing pages from within our site; they are indexed because they were crawled before they expired, which means that many of them show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content - other product suggestions - and add a meta noindex, in order to help our crawl equity and get the pages we really want indexed prioritised.
-> Another consideration is to 301 each expired listing to the category hierarchy, to pass possible link juice. But since many of these listings are findable in Google, that is not a great user experience.
-> Or shall we just leave them as 404s? Google sort of says that's OK.
Very curious about your opinions, and how you would handle this. Cheers, Croozie. P.S. I have read other Q&As regarding this, but given our large volumes and situation I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
Intermediate & Advanced SEO | sichristie0
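If the 301-to-category option wins, one way to handle it at this scale on Apache is a RewriteMap keyed on listing ID, rather than hundreds of thousands of individual rules. A hedged sketch with invented IDs and paths (RewriteMap must be declared in the server or vhost config, not in .htaccess):

```shell
# Map file: expired listing ID -> destination category URL (invented examples).
cat > expired.map <<'EOF'
12345 /category/bikes/
67890 /category/furniture/
EOF

# Matching vhost config fragment (sketch):
cat > expired.conf <<'EOF'
RewriteEngine On
RewriteMap expired txt:/etc/apache2/expired.map
# 301 only when the listing ID is present in the map; unmapped IDs fall through.
RewriteCond ${expired:$1} !=""
RewriteRule ^/listing/([0-9]+)$ ${expired:$1} [R=301,L]
EOF
```

For listings you'd rather drop entirely, a 410 Gone tells Google more explicitly than a 404 that the page is intentionally dead.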
Best way to host multiple sites for maximum seo
We have over 100 websites we built for clients that we currently host on one shared GoDaddy hosting account. They each have a link to us, but since they are all under one shared account, we feel that we are not maximizing the inbound link potential. I've looked into C-class hosting, but found that either the IPs were flagged as spam or they shared nameservers, which defeats the purpose. I've also been told that since the C-class IPs a hosting company gives you are all owned by them, that also defeats the purpose. Anyone have any solutions besides opening 130 accounts with different hosting companies? Also, will it make any difference moving existing sites onto different hosts now, or are they already tainted?
Intermediate & Advanced SEO | seopet0
How long a domain's bad reputation last?
I caught a dropped domain with a nice keyword but a poor reputation. It used to have malware on the site, and WOT (a site review tool available in Chrome, among others) has very negative reviews tied to the site. I guess Google must have records of that as well, because Chrome used to show a warning when I entered the site. My question is: how long will the bad reputation last if I build a legitimate website there?
Intermediate & Advanced SEO | zapalka0
Is there a development solution for AJAX-based sites and indexing in Bing/Yahoo?
Hi. I have outlined a solution for an AJAX-based site in order to preserve indexing and rank in Google using the hashbang. I'm curious whether anyone has insight into doing the same for Bing/Yahoo! (a development question).
Intermediate & Advanced SEO | OveritMedia0
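For background on why the hashbang works at all: a crawler that supports the AJAX crawling scheme rewrites each #! URL into an `_escaped_fragment_` query parameter and fetches that URL, expecting an HTML snapshot from the server. Bing has stated support for the same scheme, so the snapshot endpoint built for Google should in principle serve it too, though reports on how reliably Bing honours it vary. The URL transformation itself is mechanical (example URL invented):

```shell
# How a scheme-aware crawler maps a pretty #! URL to the fetchable "ugly" URL:
pretty='http://www.example.com/#!/products/widget'
ugly=$(printf '%s' "$pretty" | sed 's/#!/?_escaped_fragment_=/')
echo "$ugly"   # prints http://www.example.com/?_escaped_fragment_=/products/widget
```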