Creating a duplicate site for testing purposes: can it hurt the original site?
-
Hello,
We are soon going to upgrade our CMS to the latest version along with new functionality - the process may take anywhere from 4 to 6 weeks.
Please advise - we need to work on the live server. Here is what we have planned:
-
Take an exact replica of the site and move it to a test domain, but on the live server.
-
Block Google, Bing and Yahoo in robots.txt using their crawler tokens: User-agent: Googlebot Disallow: /, User-agent: Bingbot Disallow: /, User-agent: Slurp Disallow: / (Slurp is Yahoo's crawler - the bare engine names don't match the crawlers' product tokens).
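Those rules can be sanity-checked before launch with Python's built-in robots.txt parser - a quick sketch, using a catch-all rule that covers every compliant crawler (the test-domain URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Catch-all staging rules: block every compliant crawler from the whole test domain
staging_rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(staging_rules.splitlines())

# No compliant crawler should be allowed to fetch any staging URL
for agent in ("Googlebot", "Bingbot", "Slurp"):
    print(agent, parser.can_fetch(agent, "http://test.example.com/"))
```

One caveat: robots.txt only governs fetching by compliant crawlers; it does not by itself keep a URL out of the index.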
-
Upgrade the CMS and add the new functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain.
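As a rough illustration, the link-collection step that tools like Screaming Frog or Xenu automate can be sketched with Python's standard library (the sample HTML and paths here are made up):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<a href="/reviews/">Reviews</a> <a href="/forum/">Forum</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/reviews/', '/forum/']
```

A real check would then fetch each collected URL and record its status code, which is essentially what the desktop tools report.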
The upgrade and new-tool process may take 1 to 1.5 months.
My concern: despite blocking Google, Bing and Yahoo through user-agent disallow rules, can the URLs still be crawled by the search engines? If so, it may hurt the original site, as the test domain would read as an entire duplicate. Or is there an alternate way around this? Many thanks
-
-
Thanks - I'm using password protection and a meta noindex tag.
It's been kept out of the search engines' crawl!
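For reference, the Apache side of that password protection is just a few lines (the .htpasswd path and realm name here are placeholders):

```apache
# .htaccess on the test domain - nothing is served without a login
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user
```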
-
Hey Gagan,
So I think your question is: will content on your staging site still get indexed despite using robots.txt? The answer is yes - sometimes that does happen, especially if a lot of people link to it. The best way to keep content out of the index is to use the meta robots tag with noindex, nofollow. Search engines are much better about adhering to those than to robots.txt.
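That tag is a single line in the head of every staging page:

```html
<meta name="robots" content="noindex, nofollow">
```

Crucially, the page must remain crawlable for the engines to see the tag - a URL blocked in robots.txt can never deliver its noindex.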
Let us know if you run into any problems!
-Mike
-
Hi Gagan,
Google are generally more than happy for sites to test new pages, layouts and functionality. They even have some free tools for that purpose.
Content Experiments
https://support.google.com/analytics/answer/1745147?ref_topic=1745207&rd=1
I'm not sure about the viability of using Content Experiments to test a whole new site, but it would be worth looking into.
Let us know how you get on.
Neil.
-
Aha - thanks, Robert, for your views.
However, can any kind of duplicate URL issue still occur? Can Google still crawl the URLs despite being blocked through robots.txt, and can the original running site suffer in any way if we create a duplicate site?
It's a content-based site covering auto reviews and news updates, with a forum and blog. There is no ecommerce shopping or products involved.
Our tentative time frame to add the features, test all changes and do the major upgrade to the latest CMS version is approximately 45 days. Do you foresee any issue if both the original site and a duplicate on a test domain (despite being blocked by robots.txt) run simultaneously on the live server for that period?
Also, you referred to other ways of testing changes - is it possible to share them?
-
Gagan
I think this is a great and interesting question. First, you are adding functionality, etc. to a site and you are curious as to the effect of that on visitors to the site once they are on it. This is data anyone in SEO should want to see for their sites.
I would first say that you need to define the test period (assuming you already know what you want to measure) for the site. If it is a week for example, I do not think you need worry about whether a site with three major engines blocked will in some way run into duped content issues. (NOTE: If this is a large site and/or one with a critical revenue need - one that cannot afford to have any type of slight but temporary downturn - I would look for another way to test the changes. Even if I was sure there were no other issues.)
I am assuming that if this were, for example, an ecommerce site, a shopper would be able to purchase on both, etc.
I would not run the test for any long period of time for a site that creates leads, revenue, etc. as I think it could cause customer confusion which can be more critical than duped content.
Let us know how it works out,
Thanks