Duplicate page content & titles on the same domain
-
Hey,
My website: http://www.electromarket.co.uk is running Magento Enterprise.
The issue I'm running into is that the URLs can be shortened or modified and still load the same page. Here are a couple of examples.
Product Page URL: http://www.electromarket.co.uk/speakers-audio-equipment/dj-pa-speakers/studio-bedroom-monitors/bba0051
OR I could remove everything in the URL and just have: http://www.electromarket.co.uk/bba0051 and the link will work just as well.
Now my problem is that these two URLs load the same page title, same content, same everything, because essentially they are the very same web page.
But how do I tell Google that? Do I need to tell Google that? And would I benefit by using a redirect for the shorter URLs?
Thanks!
-
Hi Allen,
Thanks for that! Really helpful. I'll look into it right away.
Tom
-
404 errors are definitely not what you want.
You lose any link juice that may have been established for that URL. I believe you should look into a free plug-in available for Magento; it can be found with the search term "Canonical". It is easy to use, and is designed specifically for this issue.
Bing, Yahoo, Google, all follow the Canonical directive, and will solve the link juice issue automatically.
We use a different shopping cart, but have implemented the "Canonical" tag in a slightly different way, with the same result.
If you have multiple pages that a search engine flags as duplicate content, look at each one and ask why the search engine considers it a duplicate. Perhaps you have the same template being used for each of the paginated results. A page like that is a great candidate to be canonicalized.
Canonicalization does not hurt your search engine rankings. Think of it like a downtown metro area: will your business stand out more if you have two small downtown locations, or will it draw more attention as a single building that is twice as tall? When you canonicalize a URL, the link juice flows from the duplicate pages to the single page specified in the canonical URL.
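For reference, the canonical tag is a single line placed in the `<head>` of each duplicate page, pointing at the preferred URL. Using the short product URL from this thread as the (assumed) preferred version, it might look like this:

```html
<!-- Placed in the <head> of every variant of the product page, -->
<!-- including the category-path versions. Search engines treat -->
<!-- this as a strong hint that the href is the preferred URL. -->
<link rel="canonical" href="http://www.electromarket.co.uk/bba0051" />
```

The Magento plug-in (or the built-in setting Ben mentions below in the thread) generates this tag for you, so you shouldn't need to hand-edit templates.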
-
Oh okay, so yeah, a lot of 404s! Ha.
The website currently has ~4100 products including product packages (2x 1 speaker etc.)
What I had in mind was using electromarket.co.uk/product code as a URL redirector or forwarder.
It will be a long process and it's going to take a lot of patience, I imagine. But for now, is having duplicate content (in theory) on the same website going to cause problems with Google? Will Google recognise that these URLs serve the same page and take no action, or will I be penalised for it? That's my main concern at the moment!
Tom
-
No probs.
No, Magento's URL Rewrite module will not create the 301s for a mass change like going from category paths to plain product paths, so you will be left with a lot of 404s to clear up. The other thing is that your individual product URLs have no keywords in them whatsoever, except presumably the SKUs, so I think changing them may affect you adversely.
How many products do you have? 1200 hits a day is not bad going...
-
Hi Ben,
Thanks for that! I shall look into it. I'm a bit wary about turning the category URLs off, as I am currently getting around 1200 unique hits a day and wouldn't want to jeopardise that by sending potential customers to a 404 page (albeit a soft 404).
I know that if I change a product's URL, Magento sets up a URL redirect from the old URL to the new one. Is this an option with Magento for your URL change suggestion, or would I just have to live with the 404s until my customers/previous visitors start to use the new URLs?
Thanks
-
Hi Thomas,
I'm familiar with Magento and familiar with SEO.
If you go to the magento admin panel, and go to System > Configuration > Catalog > Catalog > Search Engine optimization, there is an option for:
Use Categories Path for Product URLs (which you have set to YES right now) and also this: Use Canonical Link Meta Tag For Products, which you may have set to yes or no. Enabling this last option tells Google that Product A is available under different URLs but is the same page. This is how I had my setup. However, the rel=canonical implementation is just a suggestion to search bots, not a directive, and from what I have read, when you have the same page appearing under 4 to 5 different categories, it is really down to Google and the other bots whether they agree with you on the canonical.
So, I decided to set "Use Categories Path for Product URLs" to NO. This means that your product will only be viewable under http://www.electromarket.co.uk/bba0051, which gets rid of the duplicate content issue. You do lose the keyword-rich URLs created by the category paths, but in the end I chose this option.
IMPORTANT: If you change that setting now, it will create a lot of 404s for you, as you probably have those URLs indexed in Google already. This is a decision that creates a lot of work in the short term, but longer term I think it solves a lot of problems.
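If you do flip that setting, the 404s can be mopped up with 301 redirects at the server level. A hypothetical .htaccess sketch (assuming the site runs on Apache with mod_rewrite enabled, and assuming all product codes follow a letters-then-digits pattern like "bba0051" — check your own SKU format before using anything like this):

```apache
# Hypothetical example only. Assumes Apache + mod_rewrite, and that
# product codes match three letters followed by four digits (e.g. bba0051).
RewriteEngine On
# 301-redirect any old category-path product URL to the short SKU-only URL,
# e.g. /speakers-audio-equipment/.../bba0051  ->  /bba0051
RewriteRule ^.+/([a-z]{3}[0-9]{4})$ /$1 [R=301,L]
```

A catch-all pattern like this can match more than you intend, so test it against a list of your real URLs first; per-URL redirects are safer if the catalogue has mixed SKU formats.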
Hope that helps,
Ben