My page says it has 16 errors, need help
-
My page says it has 16 errors, all of them due to duplicate content. How do I fix this? I believe it's only due to my meta description tag.
-
Glad to help
Hope it's an easy fix.
-
and apparently in unison!
-
You are very welcome and good luck!
-
HAHA! What can I say other than great minds think alike!
-
Thank you both! I will look into Dr. Pete's guide, pronto!
Cheers!
-
Hey Jake,
SNAP!
Sha
-
Hi Gajendra,
The Pro App identifies two types of duplicate errors (the red button in your Crawl Diagnostics Summary). These are Duplicate Page Content (a significant amount of content on the page has been identified as duplicate) and Duplicate Page Title (only the page title has been identified as duplicate).
Duplicate Page Title errors are most often an internal issue, where many pages in the site have been given the same page title.
Duplicate Page Content errors can be an internal and/or external issue. It may be that identical pages within the site are visible via multiple URLs, and/or the content on pages may duplicate content on other websites. This happens a lot with sites that use product descriptions, content feeds, etc. from other sites.
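For the internal case (identical pages visible via multiple URLs), the standard fix is a canonical link element pointing every variation at one preferred URL. A minimal sketch, where example.com and the product path are hypothetical:

```html
<!-- Placed in the <head> of every duplicate variation of the page -->
<link rel="canonical" href="https://www.example.com/product/blue-widget">
```

Crawlers that honor the hint consolidate the duplicate URLs onto the canonical one.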
To identify the actual URLs where duplicated content has been detected, click the Red error button in the Diagnostics Summary and you will see a list of pages where the error has been identified. In the second column, you will see a blue link which tells you how many duplicates there are for that particular item. When you click the link you will see a full list of URLs which are duplicates.
There are a number of things that can cause duplicate content errors on a site; many are due to the way the site is structured or functions. To really understand what is happening and how to deal with it, you should read Dr. Pete Meyers' landmark post, Duplicate Content in a Post-Panda World.
Hope that helps,
Sha
-
My advice would be to dive deep into the campaign you have running for your page and check what is causing the issue. It could be that your URLs aren't normalized, resulting in a copy of the page for each URL variation. This can be solved a few different ways, but unfortunately I am not quite sure what is causing the duplicate content from the information you have provided. Dr. Pete has put together a fantastic Duplicate Content Guide I would recommend checking out. He goes over each variation and some great ways to deal with duplicate content issues.
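One way to collapse such URL variations is a server-side 301 back to a single URL. A minimal .htaccess sketch, where /product and the ref= parameter are hypothetical:

```apache
RewriteEngine On
# Hypothetical: /product?ref=footer serves the same content as /product,
# so 301 any ref= variation back to the bare URL; the trailing "?" on
# the target drops the query string.
RewriteCond %{QUERY_STRING} ^ref=
RewriteRule ^product$ /product? [R=301,L]
```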
Related Questions
-
Search Console Indexed Page Count vs Site:Search Operator page count
We launched a new site and Google Search Console is showing that 39 pages have been indexed. When I perform a Site:myurl.com search, I see over 100 pages that appear to be indexed. Which is correct, and why is there a discrepancy? Also, the Search Console page index count started at 39 pages on 5/21 and has not increased, even though we have hundreds of pages to index. But I do see more results each week from Site:psglearning.com. My site is https://wwww.psglearning.com
Technical SEO | pdowling
-
Moving Some Content From Page A to Page B
Page A has written content, pictures, and videos. The written content from Page A is being moved to Page B. When Google next crawls the pages, will Page B receive the content credit? Will there be any issues because this content originally belonged to Page A? Page A is not a page I want to rank for (it just has great pictures and videos for users). Can I 301 redirect from Page A to B since the written content from A has been deleted, or is there no need? Again, I intend to keep Page A live because it has good value for users who want to see the pictures and videos.
Technical SEO | khi5
-
What if 404 Error not possible?
Hi Everyone, I get a 404 error on my page if the URL is simply wrong, but for some parameters, such as when a page has been deleted or has expired, I get an error page indicating that the ID is wrong, but no 404 status. It is very difficult for me to program a function in PHP that solves the problem and modifies the .htaccess with mod_rewrite. I asked the developer of the system to take a look, but I am not sure I will get an answer soon. I can control the content of the deleted/expired page, but the URL will be very similar to those that are OK (the URL may actually have been fine at one point, but has now expired). Thinking of solutions: I can set the expired/deleted pages as noindex. Would that help avoid the duplicate title/description/content problem? If a user goes to, e.g., mywebsite.com/1-article/details.html, I can set the head section to noindex if it has expired. Would that be good enough? Another question: is it possible to set the pages as 404 without doing it directly in the .htaccess, so avoiding the mod_rewrite problems I am having? Some magical tag in the head section of the page? Many thanks in advance for your help. Best Regards, Daniel
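As a sketch of the noindex idea described in the question (hypothetical markup; note that a meta tag can only keep the page out of the index, it cannot change the HTTP status code, which has to be sent by the server before any output, e.g. with PHP's http_response_code(404)):

```html
<!-- In the <head> of an expired or deleted item page -->
<meta name="robots" content="noindex">
```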
Technical SEO | te_c
-
No-crawl code for pages of helpful links vs. nofollow code on each link?
Our college website has many "owners" who want pages of "helpful links", resulting in a large number of outbound links. If we add code to the pages to prevent them from being crawled, will that be just as effective as making every individual link nofollow?
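As a sketch, the two options in the question look like this (they are close but not identical in effect: the page-level tag covers every link at once, while rel="nofollow" must be added to each individual link; blocking crawling entirely would instead be done with a robots.txt Disallow rule):

```html
<!-- Page-level: ask crawlers not to follow any link on this page -->
<meta name="robots" content="nofollow">

<!-- Per-link: flag one outbound link (example.com is hypothetical) -->
<a href="https://www.example.com/resource" rel="nofollow">Helpful resource</a>
```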
Technical SEO | | LAJN0 -
RegEx help needed for robots.txt potential conflict
I've created a robots.txt file for a new Magento install and used an existing example that was on the Magento help forums, but the trouble is I can't decipher something. It seems that I am allowing and disallowing access to the same expression for pagination. My robots.txt file (and a lot of other Magento robots.txt files, it seems) includes both: Allow: /*?p= and Disallow: /*?p=*& I've searched for help on RegEx and I can't see what "&" does, but it seems to me that I'm allowing crawler access to all pagination URLs, but then possibly disallowing access to all pagination URLs that include anything other than just the page number? I've looked at several resources and there is practically no reference to what "&" does... Can anyone shed any light on this, to ensure I am allowing suitable access to a shop? Thanks in advance for any assistance.
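Worth noting: robots.txt rules are not regular expressions. Google supports only the * wildcard and the $ end-of-URL anchor, and & has no special meaning at all; it is matched as a literal character. When an Allow and a Disallow both match a URL, Google applies the most specific (longest) rule. A sketch of what the common Magento pattern typically looks like, which allows plain pagination but blocks pagination combined with extra query parameters:

```text
# Allow plain paginated URLs, e.g. /shoes.html?p=2
Allow: /*?p=

# Block pagination combined with additional parameters,
# e.g. /shoes.html?p=2&color=blue (the & is literal)
Disallow: /*?p=*&
```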
Technical SEO | MSTJames
-
Index page
To the SEO experts, this may well seem a silly question, so I apologize in advance, as I try not to ask questions I probably already know the answer to, but clarity is my goal. I have numerous sites, and as standard practice, through the .htaccess I will always redirect non-www to www, and redirect the index page to www.mysite.com. All straightforward; I have never questioned this practice and have always been advised it's the best practice to avoid duplicate content. Now, today, I was looking at a CMS service for a customer's website. The website is already built and is a static website, so CMS integration was going to mean a full rewrite of the site. Speaking to a friend on another forum, he told me about a service called Simple CMS. I had a look, and it looks perfect for the customer. I went to set it up on the client's site, and here is the problem: for the CMS software to work, it MUST access the index page. Because my index page is redirected to www.mysite.com, it won't work, as it can't find the index page (obviously). I questioned this with the software company, and they informed me that it must access the index page. I explained that it won't be able to and why (because I have my index page redirected to avoid duplicate content). To my astonishment, the person there told me that duplicate content is a huge no-no with Google (that's not the astonishing part), but that it's not relevant to the index and non-index page of a website. This goes against everything I thought I knew... The person also reassured me that they have worked in SEO for 10 years. As I am a subscriber to SEOmoz, and no one here has anything to gain but offering advice, is this true? Will it not be an issue for duplicate content to show both an index page and non-index page? Will search engines not view this as duplicate content? Or is this SEO expert talking bull, which I suspect, but cannot be sure?
Any advice would be greatly appreciated; it would make my life a lot easier for the customer to use this CMS software, but I would be doing it at the risk of tarnishing the work they and I have done on their ranking status. Many thanks in advance, John
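A minimal .htaccess sketch of the setup described above (non-www to www, plus redirecting the index page to the root), using mysite.com from the question as a placeholder:

```apache
RewriteEngine On

# Canonicalize non-www to www
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

# Redirect direct requests for the index file to the root URL.
# THE_REQUEST matches the raw request line, so the internal rewrite
# that serves index.html for "/" is not redirected again (no loop).
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.(html|php)
RewriteRule ^index\.(html|php)$ http://www.mysite.com/ [R=301,L]
```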
Technical SEO | Johnny4B
-
Help needed regarding managing client expectations - tricky situation
I will try to explain the scenario as quickly as possible, and my hope is that someone can share their opinion on how they would move forward. I was introduced to a local business owner who said he wanted help with SEO. Upon looking at his current online marketing, I saw he had 2 current sites promoting the same local business (martial arts instruction / classes). Why did he have two sites? He said it made it easier for him to dominate in Google. Red Flag #1. Upon doing a quick site audit, I found a ton of problems with the existing site: black text on a black background, keyword stuffing in title tags, no canonicalization, no XML sitemap, no Google Analytics installed... on and on. In addition, the site did not really have a good look to it graphically. I told him that I recommended a fresh new site using WordPress and that we should build the content with a focus on explaining the benefits of the classes. He agreed, and we began development of a new WordPress site from the ground up. We built a sitemap, wireframe, nice design, etc. The site looks much better, and we got rid of a lot of the technical problems with the site. The problem is this: even though the new site is technically better based on on-page analysis, it is not showing up anywhere near the top for localized keywords. The site has been live for about 2 1/2 months (since March 1). I made the mistake of telling him that in a lot of past cases I was able to build a new site for other clients that would rank well for localized searches based on on-page optimization alone. This is not happening for him with the new site. The new domain is relatively new (less than a year old) and has no links at all at this point. I recommended that we do a 301 redirect from his existing domain to the new one, but he is skeptical, and I almost can't blame him. The client is not paying me to do any SEO.
The contract was to build a new site with best on-page SEO practices (title tags, header tags, meta descriptions, XML sitemap, canonicalization, etc.). I hesitate to post the links to his existing site and the new one we built, but I can see where that may shed some more light on the subject. If you're interested in taking a look, please send me a message. I guess the two questions are: 1. Is it reasonable for a site to rank well for a localized, non-competitive term based on "A" scores from on-page analysis alone? 2. What harm or foul is there in doing a 301 redirect from the old domain to the new one and then reverting back if he decides the move hurt his rankings more than helped? Thanks.
Technical SEO | bluelynxmarketing
-
Why is 4XX (Client Error) shown for valid pages?
My Crawl Diagnostics Summary says I have 5,141 errors of the 4XX (Client Error) variety, yet when I view the list of URLs they all resolve to valid pages. Here is an example:
http://www.ryderfleetproducts.com/ryder/af/ryder/core/content/product/srm/key/ACO 3018/pn/Wiper-Blade-Winter-18-Each/erm/productDetail.do
These pages are all dynamically created from search or browse using a database where we offer 36,000 products. Can someone help me understand why these are errors?
Technical SEO | jimaycock