Creating a duplicate site for testing purposes: can it hurt the original site?
-
Hello,
We are soon going to upgrade our CMS to the latest version and add new functionality. The process may take anywhere from 4 to 6 weeks.
Since we need to work on a live server, here is what we have planned:
-
Take an exact replica of the site and move it to a test domain, but on the live server.
-
Block Google, Bing, and Yahoo in robots.txt with a "User-agent: Google / Disallow: /" rule, and the same for Bing and Yahoo.
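As a side note, a sketch of what that robots.txt might look like. One caution: crawlers match on their crawler tokens (Googlebot, Bingbot, and Slurp for Yahoo), not on the brand names "Google", "Bing", or "Yahoo", so rules written against the brand names would be ignored. Since the test domain should block everything anyway, a catch-all rule is simpler:

```text
# robots.txt for the TEST domain only -- never deploy this on the live site.
# Crawler tokens (Googlebot, Bingbot, Slurp), not brand names, are what
# search engines match against; "User-agent: *" covers them all.
User-agent: *
Disallow: /
```

Keep in mind that robots.txt only discourages crawling; as noted later in this thread, it does not guarantee the URLs stay out of the index.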
-
We will upgrade the CMS and add functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain.
The upgrade and new tools may take 1 to 1.5 months.
Our concern: despite blocking Google, Bing, and Yahoo through user-agent disallow rules, can the URLs still be crawled by the search engines? If so, it may hurt the original site, since the test site would read as an entire duplicate. Or is there an alternative way around this? Many thanks.
-
-
Thanks, I am now using password protection and the meta noindex tag.
It's been kept out of the search engines' crawl!
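For anyone following along, a minimal sketch of the password-protection half of that setup, assuming an Apache server (the paths and realm name here are hypothetical placeholders; adjust for your environment):

```apache
# Hypothetical .htaccess in the test domain's document root.
# Assumes Apache with mod_auth_basic enabled.
AuthType Basic
AuthName "Staging site"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

The password file referenced above would be created with `htpasswd -c /etc/apache2/.htpasswd someuser`. HTTP auth is the most reliable of the options discussed here, because a crawler that ignores robots.txt still cannot fetch a page it gets a 401 for.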
-
Hey Gagan,
So I think your question is: will content on your staging site still get indexed despite using robots.txt? The answer is yes, sometimes that does happen, especially if a lot of people link to it. The best way to keep content out of the index is to use the meta robots tag with noindex, nofollow. Search engines are much better about adhering to those than to robots.txt.
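As a sketch, the tag Mike describes goes in the `<head>` of every page on the test domain:

```html
<!-- In the <head> of every page on the staging/test domain only -->
<meta name="robots" content="noindex, nofollow">
```

One design note: don't combine this with a robots.txt Disallow for the same pages. If the page is blocked from crawling, the engines never see the meta tag, so a URL they discover via links can still end up indexed.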
Let us know if you run into any problems!
-Mike
-
Hi Gagan,
Google are generally more than happy for sites to test new pages, layouts and functionality. They even have some free tools for that purpose.
Content Experiments
https://support.google.com/analytics/answer/1745147?ref_topic=1745207&rd=1
I'm not sure about the viability of using Content Experiments to test a whole new site, but it would be worth looking into.
Let us know how you get on.
Neil.
-
Aha, thanks, Mr. Robert, for your views.
However, can any kind of duplicate-URL issue still occur? Can Google still crawl the URLs despite their being blocked through robots.txt? Can the original, running site suffer in any way if we create a duplicate site?
It's a content-based site covering auto reviews and news, with forum and blog updates. There is no e-commerce shopping or products involved.
Our tentative time frame to add features, test all changes, and do the major upgrade to the latest version of the CMS is approximately 45 days. Do you foresee any issue if both the original site and a duplicate on the test domain (despite being blocked by robots.txt) run simultaneously on a live server for that period?
Also, you referred to other ways of testing changes. Is it possible to share them?
-
Gagan
I think this is a great and interesting question. First, you are adding functionality, etc. to a site and you are curious as to the effect of that on visitors to the site once they are on it. This is data anyone in SEO should want to see for their sites.
I would first say that you need to define the test period for the site (assuming you already know what you want to measure). If it is a week, for example, I do not think you need to worry about whether a site with the three major engines blocked will in some way run into duplicate-content issues. (NOTE: If this is a large site and/or one with a critical revenue need, one that cannot afford any type of slight but temporary downturn, I would look for another way to test the changes, even if I were sure there were no other issues.)
I am assuming that if this were an e-commerce site, for example, shoppers would be able to purchase on both, etc.
I would not run the test for any long period of time for a site that creates leads, revenue, etc. as I think it could cause customer confusion which can be more critical than duped content.
Let us know how it works out,
Thanks