Preventing CNAME Site Duplications
-
Hello fellow Mozzers!
Let me see if I can explain this properly. First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive any ignorance of terms).
We have a client that needs a CNAME record set up, because sales.DOMAIN.com needs to pull from a different data provider. They have a "store" platform that is hosted elsewhere, and the provider requires a CNAME pointed at a custom subdomain they set up on their end.
My question is: how do we prevent the subdomain from being indexed along with the main domain? If we process a redirect for the subdomain, then the site will not be able to go out and grab the other provider's info and display it. Currently, if you type in sales.DOMAIN.com, it shows the main site's homepage. That cannot be allowed to happen; as we all know, having more than one domain with the exact same content = very bad for SEO. I'd rather not rely on Google to figure it out.
Should we just have the CNAME host (the server it points to) add a robots rule and have it set to not index the subdomain? The store does not need to be indexed, as the items are changed almost daily.
Lastly, is an A record required for this type of situation in any way?
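For what it's worth, the DNS side of a setup like this is usually just one extra record. A minimal sketch in BIND-style zone notation, where the names and IP are placeholders rather than anyone's real values:

```text
; Hypothetical BIND-style zone entries (names and IP are placeholders)
; The main site resolves via an A record:
DOMAIN.com.         3600  IN  A      203.0.113.10
; The store subdomain is a CNAME aliasing the provider's custom hostname:
sales.DOMAIN.com.   3600  IN  CNAME  store-client.provider-platform.com.
```

On the A-record question: a name that carries a CNAME cannot also carry its own A record, so sales.DOMAIN.com would not get an A record of its own; the provider's hostname resolves to an IP on their side.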
Forgive my ignorance of subdomains, CNAME records, and related terms. Our server admin being unavailable is not helping this project move along any. Any advice on the best way to handle this would be very helpful!
-
It is pointing to the other server now. We have it blocked from indexing on that end, just wanted to make sure that was enough.
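One caveat on "blocked from indexing on that end": a robots.txt Disallow stops crawling, but Google can still index a blocked URL if outside links point at it. A noindex response header closes that gap. A minimal sketch, assuming (hypothetically) the provider's server runs Apache with mod_headers; the directive names are real, the context is invented:

```text
# Hypothetical Apache vhost config for sales.DOMAIN.com on the provider's server
<IfModule mod_headers.c>
    # Tell search engines not to index anything on this subdomain,
    # even if they discover the URLs via external links.
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note that Google only sees this header if it is allowed to crawl the URL, so a site relying on the header usually does not also block crawling in robots.txt.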
-
No, it is because of where sales is pointing. It seems to me that you don't have your DNS set up correctly; you don't want sales pointing to your main website, you want it pointing at the other server.
-
So does this work better because Google will not show an IP address in search results?
-
You need to point sales.domain.com at the server that hosts the store, not at your own webserver.
Don't point it at your main site.
Do point it at the hosting provider, either as a CNAME to the hostname they gave you, or as an A record:
sales.domain.com > 123.123.123.123
where 123.123.123.123 is the IP of the hosting webserver. (Strictly speaking, a CNAME can only point at another hostname; pointing directly at an IP takes an A record.)
-
Hello David,
I think a robots rule (there are many examples out there) should be more than enough in your case! Take a look at this helpful article: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt
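If you want to sanity-check a rule like that before relying on it, Python's standard library can evaluate a robots.txt against sample URLs. A quick sketch; the hostname is a placeholder:

```python
import urllib.robotparser

# A robots.txt that blocks the entire subdomain, as it would be served
# from http://sales.example.com/robots.txt (hostname is a placeholder)
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any crawler that obeys robots.txt should refuse every URL on the subdomain
print(parser.can_fetch("*", "http://sales.example.com/"))          # False
print(parser.can_fetch("*", "http://sales.example.com/products"))  # False
```

The same parse-and-check approach works for spot-checking more selective rules before they go live.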
I hope that was helpful! Sorry about my English... I'm Spanish
Luis