www.colourbanners.co.uk/ and colourbanners.co.uk showing up as two separate URLs - is this going to be a duplicate content issue?
-
Hi Guys,
I have just created a report in Moz and there appear to be 91 duplicate content issues with the site, which I need to fix as I think they could be the reason why we are suffering from a penalty.
One of the main questions I have concerns these three variations of the URL:
http://www.colourbanners.co.uk/
http://colourbanners.co.uk
http://colourbanners.co.uk/
Each has links pointing to it. My question is: could this be causing a dupe issue?
Regards,
Gerry
-
In essence, it tells Google which URL is the real page, so the duplicate is removed from the index.
-
Many thanks Geoff,
So putting <link rel="canonical" href="http://www.colourbanners.co.uk"> in the head of http://www.colourbanners.co.uk/index.htm tells Google that the latter is a dupe?
Just want to make sure I have it right,
Cheers
-
Take a look in Majestic SEO
It may also be in your webmaster tools
-
Hi Chris,
When looking at Open Site Explorer I only get 472 links in total. I'm confused now.
Regards,
Gerry
-
Chris is right. The issue is that you won't have control of which version gets in the results - which one people link to or share via social media. You won't get a penalty but you would be best to get rid of the duplicates using 301s and rel=canonical - this will get the issue sorted the quickest and the page will get the maximum benefit from shares and links.
be careful with rel=canonical
your homepage would be: <link rel="canonical" href="http://www.colourbanners.co.uk">
then lower page 1: <link rel="canonical" href="http://www.colourbanners.co.uk/lower-page-1.htm">
then lower page 2: <link rel="canonical" href="http://www.colourbanners.co.uk/lower-page-2.htm">
do not put the same tag across all pages
-
Although it may be a factor, I doubt it's the larger reason. I'd look into your link profile further: you've got 15k links from gfoods.org.uk, which isn't going to be helping you.
It also depends how competitive your terms are.
-
Thanks.
I will apply the rel=canonical too. The site www.colourbanners.co.uk dropped off the face of the earth for many of its key terms, although it is still ranking for the brand name.
No manual actions in WMT, and the link profile is quite natural (the brand name is the majority).
Could this dupe content be the key?
Your opinions and thoughts are welcome
-
It can take a while to de-index, though the 301 will work before then. I wouldn't worry about the duplicate content so much; Google is pretty clever. Having said that, rel=canonical won't hurt you either.
"I wouldn't stress about this unless the content that you have duplicated is spammy or keyword stuffing." - Matt Cutts
*Edit: Yes Gez, that would then go in the head of each page.
-
Hi Geoff,
Thanks for your response. So a 301 may not be enough?
I want all pages with the www. to be the original pages, so would I apply the following to any duplicated pages:
<link rel="canonical" href="http://www.colourbanners.co.uk">
(i.e. this tag would be added to the other dupe home pages)?
Your help is much appreciated
-
With Google it depends on a range of factors. The URLs will be removed from the index once Google crawls them again.
If I were you, I would still implement <link rel="canonical" href="full URL of page"> in the head section - note the URL should be the www version of the page and must change to the URL of the page you are on.
Have no idea how long Moz takes.
-
Thanks for the advice everyone
I 301 redirected a lot of pages, as it appeared that each page of the site had two URL variations.
Will Google auto de-index these duplicated URLs? How long does it take? I only just put the redirect on, but the duplicate issue is still showing in Moz (when does Moz update?)
Regards,
Gerry
-
you'll need to 301 redirect http://www.colourbanners.co.uk/index.htm and http://www.colourbanners.co.uk/index.html to http://www.colourbanners.co.uk/ to avoid duplicate homepages
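For reference, that index-page redirect could be sketched in .htaccess like this (an illustration assuming Apache with mod_rewrite enabled; adapt to your own server setup):

```apache
RewriteEngine On
# Only act on direct client requests for /index.htm(l), so an internal
# rewrite of "/" to the index file does not cause a redirect loop.
RewriteCond %{THE_REQUEST} \s/index\.html?[\s?] [NC]
# 301 both /index.htm and /index.html to the root URL.
RewriteRule ^index\.html?$ http://www.colourbanners.co.uk/ [R=301,L]
```

The condition on THE_REQUEST is the usual guard when the server internally serves index.htm for the root URL.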
-
No, they are not. If you use rel=canonical on the main one it will help solve any issues. You shouldn't redirect into http://abc.com/ but rather the other way around, though that is preference.
In short, rel=canonical will stop any duplicate content issues.
-
Good question; I was wondering the same: are http://www.abc.com/ and http://www.abc.com the same?
When I use the Site Explorer tool it displays results from http://www.abc.com/, but results are generated for http://www.abc.com as well.
Kindly let me know how to solve this issue. Would a simple redirect of http://www.abc.com to http://abc.com/ solve it?
(http://www.abc.com is just an example)
-
Hi Gerry,
Best practice is to redirect into one main site, so let's say your main site is http://www.colourbanners.co.uk/
We would then redirect http://colourbanners.co.uk into the other (or vice versa), and yes, it can cause duplicate content issues. As a safety precaution you can also set up rel=canonical on the main page.
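As a concrete sketch of that redirect (assuming Apache with mod_rewrite; this is an illustration, not the site's actual config):

```apache
RewriteEngine On
# Match requests arriving on the bare (non-www) hostname...
RewriteCond %{HTTP_HOST} ^colourbanners\.co\.uk$ [NC]
# ...and 301 them to the same path on the www hostname.
RewriteRule ^(.*)$ http://www.colourbanners.co.uk/$1 [R=301,L]
```

After this, http://colourbanners.co.uk/any-page permanently redirects to http://www.colourbanners.co.uk/any-page, so links and shares consolidate on one version.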
More info here - https://support.google.com/webmasters/answer/44231?hl=en
Good luck!