Sitemap and Privacy Policy flagged as duplicate content?
-
On a recent crawl, Moz flagged duplicate content on our site. However, the two pages listed are our sitemap and our privacy policy -- very different pages:
http://elearning.smp.org/sitemap/
http://elearning.smp.org/privacy-policy/
What is our best option to address this issue? I had considered adding a noindex tag to the privacy policy page, but since we have enabled user insights in Google Analytics, we need the privacy policy displayed on the site, and I worry that a noindex on the page would cause problems later.
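A meta robots noindex affects only search engine indexing; the page stays fully visible to visitors, and Google Analytics keeps tracking it, so the user-insights requirement would be unaffected. A minimal sketch of what that tag would look like in the head of /privacy-policy/:

```html
<!-- Keeps the page out of search results while leaving it fully
     accessible to visitors, crawlers, and analytics scripts. -->
<meta name="robots" content="noindex, follow">
```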
-
Just ignore it; duplicate content is not a real issue, and definitely not in this case. What Moz looks at is the overlap in page code: if the code is more than a certain percentage identical, it marks the pair as duplicates, so the check isn't especially intelligent. Don't worry about duplicate content with Google itself, either; you only get into trouble if you really mess things up.
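To make that overlap check concrete, the sketch below is a toy illustration (not Moz's actual algorithm, and the threshold is made up) that scores two pages by the share of three-word shingles they have in common. Thin pages wrapped in identical navigation and footer markup can score high even when the visible copy differs, which is how a sitemap and a privacy policy end up paired.

```python
# Toy illustration of how a crawler can flag near-duplicates: score the
# overlap of three-word "shingles" between two pages. Not Moz's actual
# algorithm; the threshold below is made up.

def shingles(text: str, k: int = 3) -> set:
    """Split text into overlapping k-word sequences."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two thin pages sharing the same header/footer boilerplate.
boilerplate = ("home products services pricing blog about contact careers "
               "sitemap privacy policy terms of service copyright 2016 smp "
               "all rights reserved")
sitemap_page = f"{boilerplate} site map listing every page {boilerplate}"
privacy_page = f"{boilerplate} we respect your privacy {boilerplate}"

DUPLICATE_THRESHOLD = 0.5  # illustrative cutoff, not Moz's real number
score = similarity(sitemap_page, privacy_page)
print(f"{score:.2f}", score > DUPLICATE_THRESHOLD)
```

With these inputs the boilerplate shingles dominate both sets, so the pair crosses the illustrative threshold even though the visible copy is entirely different.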
-
You could also try requesting re-indexing of the pages in Search Console.
Related Questions
-
I'm doing a crawl analysis for a website and finding all these duplicate URLs with "null" being added to them and have no clue what could be causing this.
Does anyone know what could be causing this? Our dev team thinks it's caused by mobile pages they created a while ago, but it is adding thousands of additional URLs to the crawl report, and they are being indexed by Google. They don't see it as a priority, but I believe these could be very harmful to our site. Examples from the URL strings:
uruguay-argentina-chilenullnull/days
rainforests-volcanoes-wildlifenullnull/reviews
of-eastern-europenullnullnullnull/hotels
Web Design | | julianne.amann
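URLs like these usually come from a link template interpolating a variable that is null or undefined; JavaScript stringifies both straight into the path, which fits the dev team's mobile-pages theory. A hypothetical Python analogue (Python renders None as "None" rather than "null", and every name here is made up):

```python
# Hypothetical reconstruction of the failure mode behind URLs like
# "uruguay-argentina-chilenullnull/days": a link template interpolates
# fields that are sometimes missing. JavaScript turns null/undefined into
# the literal strings "null"/"undefined"; Python's analogue is "None".
# All names here are illustrative.

def build_trip_url_buggy(slug, region=None, season=None):
    # Missing values leak into the path as literal text.
    return f"{slug}{region}{season}/days"

def build_trip_url_safe(slug, region=None, season=None):
    # Drop missing parts instead of interpolating them blindly.
    parts = [p for p in (slug, region, season) if p]
    return "".join(parts) + "/days"

print(build_trip_url_buggy("uruguay-argentina-chile"))  # ...NoneNone/days
print(build_trip_url_safe("uruguay-argentina-chile"))   # .../days
```

Once the template is fixed, the already-indexed null URLs can typically be 301-redirected to their clean counterparts or removed via Search Console.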
Duplicate Content Issue: Mobile vs. Desktop View
Setting aside my personal issue with Google's favoritism toward responsive websites, which I believe don't always provide the best user experience, I have a question about duplicate content. I created a section of a WordPress web page (using Visual Composer) that displays differently on mobile than on desktop. The section has the same content in both views but is formatted differently to give a better user experience on mobile devices. I did this by creating two different text elements, formatted differently but containing the same content. The problem is that both elements appear in the page's source code. According to Google, does that mean I have duplicate content on this page?
Web Design | | Dino64
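Generally no: duplicate content concerns the same copy appearing at separate URLs, while this is repetition within one page, which Google evaluates as a single document. The pattern Visual Composer produces boils down to something like this (class names and the breakpoint are illustrative):

```html
<!-- Simplified version of what a page builder emits: the same copy twice,
     toggled per viewport with CSS. Both blocks are in the source, but this
     is within-page repetition, not duplicate content across URLs. -->
<style>
  .desktop-only { display: block; }
  .mobile-only  { display: none; }
  @media (max-width: 767px) {
    .desktop-only { display: none; }
    .mobile-only  { display: block; }
  }
</style>
<div class="desktop-only">Wide-format copy…</div>
<div class="mobile-only">Stacked mobile copy…</div>
```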
Fixing render-blocking JavaScript and CSS in above-the-fold content
We don't have a responsive design site yet, and our mobile site is built through Dudamobile. I know it's not the best, but I'm trying to do whatever we can until we get around to redesigning it. Is there anything I can do about the following PageSpeed Insights errors, or are they just a function of using Dudamobile?

Eliminate render-blocking JavaScript and CSS in above-the-fold content: Your page has 3 blocking script resources and 5 blocking CSS resources. This causes a delay in rendering your page. None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.

Remove render-blocking JavaScript:
http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
http://mobile.dudamobile.com/…ckage.min.js?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…pts/blogs.js?version=2015-04-02T13:36:04

Optimize CSS delivery of the following:
http://fonts.googleapis.com/…:400|Great+Vibes|Signika:400,300,600,700
http://mobile.dudamobile.com/…ont-pack.css?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…kage.min.css?version=2015-04-02T13:36:04
http://irp-cdn.multiscreensite.com/kempruge/files/kempruge_0.min.css?v=6
http://irp-cdn.multiscreensite.com/…mpruge/files/kempruge_home_0.min.css?v=6

Thanks for any tips, Ruben
Web Design | | KempRugeLawGroup
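Where the template is editable (tags injected by Dudamobile may not be reachable), the standard fixes are deferring scripts, inlining the critical CSS, and loading full stylesheets without blocking render. A sketch, with stylesheet paths shortened to placeholders:

```html
<!-- Defer scripts so HTML parsing is not blocked (execution order kept). -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"
        defer></script>

<!-- Inline just the CSS needed to paint above-the-fold content... -->
<style>
  /* critical above-the-fold rules here */
</style>

<!-- ...then load full stylesheets without blocking render. -->
<link rel="preload" href="/files/site.min.css" as="style"
      onload="this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/files/site.min.css"></noscript>
```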
Duplicate Content? Designing new site, but all content got indexed on developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design and testing. Several thousand products were loaded into the sandbox site. Then Google and Bing crawled and indexed it (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now?
When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue? How can the developer fix this?
Web Design | | trafficmotion
Subdomains, duplicate content and microsites
I work for a website that generates a large amount of unique, quality content. The site has had development issues with our web builder, though, and they are going to separate it into different subdomains at launch. It's a scholarly site, so the subdomains will cover areas like history and science. Don't ask why we aren't using subdirectories, because trust me, I wish we could. Since we have to use subdomains, I have a couple of questions. Will the duplication of code, since all subdomains will share the same design and look, heavily penalize us, and is there any way around that? Also, if we generate a good amount of high-quality content on each subdomain, could we link those sites to our main site as a link-building benefit? And finally, would footer links connecting all the subdomains be a good thing to add?
Web Design | | mdorville
How to provide product information without duplicate content?
Hi all, I have an ecommerce website with approximately 400 products. These are quite technical products, and I used to have helpful information about them on the product pages. My SEO company told me to remove all of this, as I had lots of duplicate content issues. I have since had content writers rewrite all the product descriptions (about 250 words per product), and now I am trying to figure out a way of getting the helpful information back on the pages in some kind of dynamic way. There are basically five or six blocks of information that can be added to each product page, and these overlap across hundreds of products. I was thinking of creating a separate static page for each block of useful information and linking to it from the product pages; however, I would ideally prefer not to keep sending customers to other pages. So I wanted to see if others have come across similar issues and how they made this content available to the user without it counting as duplicate content. Please note that using images would not work here, as the content varies in size and most of it is text-based. Regards, James
Web Design | | isntworkdull
Penalized by duplicate content?
Hello, I am in a very weird position. I am managing a website (an EMD) where part of the site dynamically creates pages. The former webmaster who built this system thought it would help with SEO, but I doubt it! The site now has about 1,500 pages which look duplicate, but are they really? Each page has a unique URL, but the content is pretty much the same: one image and a different title of 5-8 words. There is more: all these pages are not accessible to users, only to crawlers! This URL machine is part of a PHP-based photo gallery, and I never understood the point of it. The site overall is not performing well in the SERPs, especially after Penguin, but judging by the link profile, domain authority, construction (OK, besides that crazy photo gallery), and content, it has never reached the positions it should have. The majority of these mysterious pages, and mostly their images, are cached by Google, and some of them rank in top places for certain SERPs (the ones that match the short on-page title), but the numbers are poor: 10-15 clicks per month. Are these pages considered duplicates even though they are cached, and is it safe for the site to remove 1,500 of them at once? The SEOmoz tools have flagged some of them as duplicates, but not the majority. Can these pages hurt the whole site's standing in search engines? (It has dropped in Google and disappeared from Yahoo and Bing!) Do I also have to tell Google about the removal? I have not seen anything like this before, so any comment would be helpful. Thank you!
Web Design | | Tz_Seo
Alternatives to WordPress for updating content of a static HTML site
I have a static HTML site which I cannot update myself. What solutions or programs would you recommend for gaining the ability to update it myself? I'm reluctant to switch to WordPress because the CMS-based sites hosted by my web hosting company get routinely hacked. Thank you!
Web Design | | translate