Minimising duplicate content
-
From a duplicate-content perspective, is it best to create each blog post with a single tag, so that Google doesn't treat the same post returned under different tag URLs as duplicate content? Or does it not matter?
For example, the URLs below
http://www.ukholidayplaces.co.uk/blog/?tag=/stay+in+Margate
http://www.ukholidayplaces.co.uk/blog/?tag=/Margate+on+a+budget
both return the same post...
Thanks
-
Hi!
A little late to the party here - thanks, Geoff, for helping out!
While creating excerpts for the tag pages would certainly help, I'd also suggest crawling your own site with something like Screaming Frog SEO Spider.
I just did a crawl, and I see a bunch of issues needing attention:
- Just about all of your meta descriptions are exactly the same
- Your H1s are all the same
- A bunch of duplicate titles (for example, all the author archive subpages are given the same title)
- I don't see any meta robots or canonical tags in use at all; these would help you control which pages get indexed or receive ranking value.
- You have tons of meta keywords, mostly duplicates, and the meta keywords tag shouldn't be used anymore.
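To make the duplicate-title and duplicate-description problem concrete, here's a minimal sketch of the kind of grouping a crawler report does for you. The URLs and metadata below are hypothetical stand-ins, not real crawl data from the site:

```python
from collections import defaultdict

# Hypothetical crawl results: URL -> (title, meta description).
# In practice these would come from a crawler export (e.g. Screaming Frog).
pages = {
    "/blog/?tag=/stay+in+Margate": ("UK Holiday Places", "Find great UK holidays."),
    "/blog/?tag=/Margate+on+a+budget": ("UK Holiday Places", "Find great UK holidays."),
    "/blog/margate-on-a-budget": ("Margate on a Budget", "Tips for a cheap Margate break."),
}

def find_duplicates(pages, field):
    """Group URLs that share the same title (field=0) or meta description (field=1)."""
    groups = defaultdict(list)
    for url, meta in pages.items():
        groups[meta[field]].append(url)
    # Keep only values shared by more than one URL.
    return {value: urls for value, urls in groups.items() if len(urls) > 1}

duplicate_titles = find_duplicates(pages, 0)
duplicate_descriptions = find_duplicates(pages, 1)
```

Every title or description that maps to more than one URL is a page competing with its siblings; each indexable page should get its own.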
You've got some additional issues to work out besides just the tags thing.
Check Google Webmaster Tools to confirm this as well - it will show you everything you need to fix!
-Dan
-
You're welcome, Jonathan.
Feel free to look at how other successful organisations implement this on their blogs. Take Mashable, for example: their topics pages are essentially what blog articles are tagged with, and they cut their snippets off at about 170 characters.
Also, ensure that you're using the canonical link element on blog article pages too, to let search engines know those are the originals and where you want the weight placed.
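If you want to spot-check that your pages actually carry a canonical tag, a small sketch like this will pull it out of a page's HTML. The page markup and URL here are hypothetical; your blogging platform is what actually has to emit the tag:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

# Hypothetical page source; in practice you'd fetch each URL and feed its HTML in.
page = """<html><head>
<title>Margate on a Budget</title>
<link rel="canonical" href="http://www.ukholidayplaces.co.uk/blog/margate-on-a-budget/">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
```

Both tag-page URLs pointing a canonical at the one article URL is exactly the "weight in one place" behaviour you want.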
-
Thanks Geoff,
I wasn't sure after the recent updates.
Copyscape finds loads of matches, but Google didn't...
-
No - assigning multiple tags to pages on your website is good practice (providing they're relevant, of course).
What you should consider is displaying only excerpts on tag and search-result pages so they aren't flagged as duplicate content. You don't need to display the entire post on a tag page; a small snippet with a 'Read More' (or similar) link will ensure the full original only ever exists at one location: its specific URI.
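Generating those excerpts is just a word-boundary truncation. A minimal sketch, using the ~170-character cutoff mentioned above as an assumed limit (the function name and sample text are illustrative only):

```python
def make_excerpt(body, limit=170):
    """Truncate a post body at a word boundary near `limit` characters,
    so the full text only ever lives at the post's own URL."""
    if len(body) <= limit:
        return body
    cut = body.rfind(" ", 0, limit)  # last space before the limit
    return body[:cut] + "…"

# Illustrative post body: a short lead followed by 300 filler characters.
post = "Margate on a budget: " + "x" * 300
excerpt = make_excerpt(post)
```

On the tag page you'd render the excerpt followed by the 'Read More' link to the canonical post URL.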