Duplicate content errors in SEOmoz even after fixing the canonical tag
-
http://www.webworld.no - We have been getting Page Duplicate Content errors even though we have fixed the canonical. We have a portfolio list page and a detail page for each portfolio item, but we get an error that all of the portfolio pages have duplicate content, even though I added a canonical tag pointing to the root page. Please help me overcome this.
Also, I see that webworld.no and www.webworld.no are treated as duplicates. Is there anything I need to fix in the redirection on the server?
-
Hi Robert
I think the problem with your portfolio is that there is not enough unique content on the page for Google to read. Let's look at these two examples:
http://www.webworld.no/referanser/nordstrand-kiropraktorklinikk_14/#topp_
http://www.webworld.no/referanser/r%C3%B8d-eiendom_13/#topp_
Remember, Google doesn't "see" images the way we see them, so for all intents and purposes it will treat these two pages as nearly identical, hence a potential duplicate content issue. You can check how your page looks in Google's eyes with an SEO browser tool.
As for the webworld.no & www.webworld.no issue, what's basically happening is that both URLs are being indexed, therefore they are technically duplicates. For homepages, I recommend 301 redirecting one of the URLs to the other. Decide how you'd like your homepage to be seen and then redirect the old URL to the new one. SEOMoz also provides this handy redirection guide.
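For example, if the site runs on Apache with mod_rewrite enabled (an assumption; Nginx and IIS have their own equivalents), a 301 redirect from the non-www host to the www host might look like this in .htaccess:

```apache
# Permanently (301) redirect webworld.no to www.webworld.no,
# preserving the requested path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^webworld\.no$ [NC]
RewriteRule ^(.*)$ http://www.webworld.no/$1 [R=301,L]
```

After deploying, you can verify with `curl -I http://webworld.no/` that the response is a 301 with a Location header pointing at the www version.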
Hope this helps you.
Related Questions
-
Duplicate Content & Tags
I've recently added tags to my blog posts so that related blog posts are suggested to visitors. My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 duplicate content issues in my blog. I'm using Shopify, so I can't edit the robots.txt file, but is my understanding correct that URLs with 2 or more tags will be ignored? I've searched the Shopify documents and forum and can't find a straight answer. My understanding of SEO is fairly limited. The relevant robots.txt rules are:
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Content Development | Tangled
-
Will having duplicate content on four websites cause a problem?
A client of ours has four websites for the different shops they run in the surrounding area. Each website has original content as well as duplicate content; this is for things like product advice, which needs to be the same across the sites. Will having duplicate content on these four websites cause a problem? How can it be mitigated? We can't refer the visitor to another website to get the product information, as this would break the user experience, and of course shopping cart sessions would not carry over.
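One common mitigation here (a sketch, not the asker's actual setup; the domains and path are hypothetical) is a cross-domain rel=canonical on the duplicated advice pages, which tells search engines which of the four copies is the preferred one while keeping all four visible to visitors:

```html
<!-- In the <head> of the duplicate product-advice page on shop2.example.com,
     pointing to the preferred copy on shop1.example.com -->
<link rel="canonical" href="http://shop1.example.com/advice/choosing-a-product/" />
```

The shopping-cart concern doesn't apply, since the visitor never leaves the site they are on; only crawlers act on the canonical hint.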
Content Development | Rebecca.Holloway
-
Could posting on YouMoz get you penalized for "Guest Blogging?"
From my understanding, Matt Cutts hates guest blogging, so I told all of the attorneys here not to write anywhere but on our blog. However, I realized people are constantly "guest blogging" on Moz, and considering how smart these people are, it must not be hurting them or they wouldn't do it. What I don't understand is why. Yes, I do get that the quality of what's on YouMoz is high and not spammy, but I got the impression that didn't really matter; guest blogging would get you into trouble no matter what. Can someone clarify for me? Thanks, Ruben
Content Development | KempRugeLawGroup
-
Is this duplicate content?
I'm optimizing a Magento site and have a question regarding duplicate content. Currently, you can dig down to an individual product listing with URLs similar to this: (1) http://www.foo.com/category/sub-category/sub-sub-category/item.html However, we also have a "Top 50" area with a link to the same page; the URL for that page is: (2) http://www.foo.com/item.html Both are dynamic, so a static page for (2) with different content is out of the question. I asked IT to have both (1) and (2) point to exactly the same page, within the same categor(ies), but they said I would have to choose one or the other. So, here are my questions: Will Google consider the pages to be duplicates of each other and thus incur a penalty? If I were to choose one structure, which would be the "friendliest"? I think I've come across questions similar to this in Q&A but haven't been able to locate them, so I'm sorry to be posting a "duplicate question." I've been busy writing completely different product descriptions, nice and deep and value-rich, for more than 300 items and categories, and am only now starting to look at current SEO protocols; I'm hoping to ask Google for a site reevaluation in another 2 weeks or so. Thanks.
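A common approach in this situation (a sketch using the question's own example URLs; Magento also ships a "Use Canonical Link Meta Tag" setting under the Catalog configuration that can emit this automatically) is to pick one URL as canonical and reference it from both pages, so the duplicate consolidates rather than competes:

```html
<!-- Emitted in the <head> of BOTH URL variants of the product page -->
<link rel="canonical" href="http://www.foo.com/category/sub-category/sub-sub-category/item.html" />
```

With the canonical in place, both the category path and the "Top 50" short URL can keep working for visitors, while search engines credit a single version.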
Content Development | RScime25
-
How to sort out 4XX (Client Error) in wordpress
I have 1,787 4XX (Client Error) issues in my WordPress blog and I have no idea how to get rid of them. Can someone please help?
Content Development | afrika111
-
Marking our content as original, where the rel=author tag might not be applied
Hello, can anyone tell me if it is possible to protect text-type content without the rel=author tag? We host a business listing site where, apart from the general contact information, we have also started to write original 800+ character-long unique contents for the suppliers where we expect visits, so rankings should increase.
My issue is that this is a very competitive business, and content scraping is an everyday practice. Of course, I would like to keep my original content, or at least mark it as mine for Google. The easiest way would be the author tag, but the problem is that I do not want our names and our photos to be assigned to these contents: on the one hand, we are not acknowledged content providers in our own right (no bio or whatsoever), and on the other hand, we provide contents for every sort of business, so just having additional links to our other contents might not help readers get what they want. I also really do not think that a photo of me could help increase the CTR from the SERP. :)
What we currently do is submit every major fresh content through URL submission in WMT, hoping that first indexing might help. We have only a handful of them within a day, so not more than 10. Yes, I could perhaps use absolute links, but this is not a feasible scenario in all cases, and as for DMCA, as our programmer says, whatever you can see on the internet you can basically take. So finally, I do not mind our contents being stolen, as I can't possibly prevent this; I want our original content to be recognized as ours by Google, even after the stealing is done. (Best would be an 'author tag for business', connected to our business Google+ page, but I am not aware that this function can be used this way.) Thank you in advance to all of you sharing your thoughts with me on the topic.
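On the "author tag for business" idea: at the time there was a site-level counterpart, rel=publisher, which linked a whole site to a business's Google+ page rather than to an individual author's profile (the Google+ URL below is a placeholder):

```html
<!-- On the homepage, linking the site to the business's Google+ page -->
<link rel="publisher" href="https://plus.google.com/+YourBusinessPage" />
```

Unlike rel=author, this attaches no individual's name or photo to the content, which matches the constraint described above.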
Content Development | Dilbak
-
How can I rank using translated content?
My friend has a website with similar content to mine, in a different language however. He has allowed me to translate his content if I link to it every post (can be nofollow). Does Google penalize me for clearly translated content? How can I make sure it ranks well? BTW, if I convince him that I don't link to him, is it better SEO-wise? Best,
Cherman
Content Development | kikocherman
-
Duplicate Terms of Use and Privacy Policy, is it a problem?
Hi, if I use the same terms of use and privacy policy content across my websites, does it amount to a duplicate content issue? Does it affect my websites in any manner? Regards
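A common belt-and-braces option for boilerplate legal pages (a sketch; duplicated terms-of-use and privacy pages are generally harmless, but this keeps them out of the index entirely while still letting crawlers follow their links) is a robots meta tag:

```html
<!-- In the <head> of each site's terms-of-use and privacy-policy pages -->
<meta name="robots" content="noindex, follow" />
```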
Content Development | IM_Learner