Is this duplicate content?
-
My client has several articles and pages that each resolve at two different URLs.
For example:
/article.cfm?intDocID=22572
is the same article as:
/bc-blazes-construction-trail
I was not sure whether this counts as duplicate content,
or whether I should be putting "/article.cfm" into the robots.txt file.
If anyone could help me out, that would be awesome!
Thanks
-
Agreed - although I think a 301 redirect or canonical tag implementation would probably be OK. If there's a database lookup that can translate the DocID into a URL string, the canonical is easy (I write some CF code, so I can at least tell you it's doable). Keep in mind that "article.cfm" is only one template, so if you can find a solution that's data-driven, it's just as easy for 1,000 pages as it is for 10.
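For example, here's a minimal sketch of that canonical lookup. The "articles" table, its columns, and the datasource are all assumptions on my part; swap in whatever your CMS actually uses:

```cfm
<!--- Look up the friendly URL for the requested DocID.
      Hypothetical table/column names - adjust to your schema. --->
<cfquery name="getCanonical" datasource="#application.dsn#">
    SELECT friendlyURL
    FROM articles
    WHERE docID = <cfqueryparam value="#url.intDocID#" cfsqltype="cf_sql_integer">
</cfquery>

<!--- Emit the canonical tag in the page head. --->
<cfif getCanonical.recordCount>
    <cfoutput>
        <link rel="canonical" href="https://www.example.com/#getCanonical.friendlyURL#" />
    </cfoutput>
</cfif>
```

Because article.cfm is a single template, dropping this into its head section covers every article at once.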
You could also create a dynamic 301 redirect via <cfheader> - the core logic is the same. Basically, you look up the friendly URL from the DocID and issue the redirect header dynamically. You just need someone who understands your CMS and data. The actual code is only a few lines; understanding your setup is the time-consuming part.
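A sketch of that redirect, reusing the hypothetical getCanonical lookup from the example above:

```cfm
<!--- If the request came in on the ugly URL, send a permanent
      redirect to the friendly one and stop processing. --->
<cfif structKeyExists(url, "intDocID") and getCanonical.recordCount>
    <cfheader statuscode="301" statustext="Moved Permanently">
    <cfheader name="Location" value="https://www.example.com/#getCanonical.friendlyURL#">
    <cfabort>
</cfif>
```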
-
ATMOS, those are just the same page, so a canonical tag should do it. But since you also want to stop Google indexing the duplicate, you could detect that the page was called via article.cfm and output a noindex meta tag too - just not when it's served at the friendly URL.
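A sketch of that check. It assumes duplicate requests actually hit article.cfm directly; if your friendly URLs are rewritten to the same template behind the scenes, you'd need to inspect something like cgi.request_url instead:

```cfm
<!--- Only emit noindex when the page was requested via the
      article.cfm?intDocID=... form of the URL. --->
<cfif findNoCase("article.cfm", cgi.script_name) and structKeyExists(url, "intDocID")>
    <meta name="robots" content="noindex, follow" />
</cfif>
```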
-
I mostly agree with kchan.
- It is considered duplicate content.
- The simplest fix is a rel=canonical on the pages with IDs.
However, I suspect a 301 redirect is not the best option here. In particular, if your website runs Omniture and/or Google Analytics tracking code, the redirect can skew the traffic they report.
Be careful if you choose that route.
-
Awesome Chan, thanks. That was my thought as well. The most difficult part will be figuring out how to get that script in place.
-
Any chance you can spend a little time writing it out?
My guess is that we should be putting a rel=canonical tag on all of the article.cfm?intDocID=22572-type pages, which would then point the bots to our /bc-blazes-construction-trail. But what's the easiest way to do that across the whole site?
-
Hello,
It sure is duplicate content. Putting "/article.cfm" into robots.txt won't fix it, though: that just blocks crawling of the template, it doesn't consolidate the two URLs. You need a permanent redirect. I had a brief look at the site and there appear to be over 1,000 pages, so this might take a while, but it is necessary; if not, your client's rankings will suffer and the site will most likely be penalised. A simple approach is a canonical link in /article.cfm?intDocID=22572 so you are showing Google that the main article is located at /bc-blazes-construction-trail.
However, the best way is of course a 301 permanent redirect. I'm sure you could get a web dev to write a script that runs through the database and outputs the redirects, rather than manually redirecting 1,000+ pages. If it can't be done in-house, you could outsource it on freelancer.com for around $200-300.
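For illustration, a one-off generator along those lines. The articles table is the same hypothetical one as in the earlier examples, and the output here is Apache mod_rewrite rules; for IIS or a CFML-level redirect map the loop is identical, only the output format changes:

```cfm
<!--- Loop every article in the (hypothetical) database and write
      one redirect rule per page, instead of hand-coding 1,000+. --->
<cfquery name="getAll" datasource="#application.dsn#">
    SELECT docID, friendlyURL FROM articles
</cfquery>

<cfsavecontent variable="rules"><cfoutput query="getAll">RewriteCond %{QUERY_STRING} ^intDocID=#docID#$ [NC]
RewriteRule ^article\.cfm$ /#friendlyURL#? [R=301,L]
</cfoutput></cfsavecontent>

<!--- Dump the rules to a file for review before deploying. --->
<cffile action="write" file="#expandPath('redirects.txt')#" output="#rules#">
```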
Thanks
-
That would definitely be considered duplicate content. There are a few things you can do to fix it, but rather than wasting a bunch of time writing it out here I would recommend visiting the link below for more detailed info: