Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Duplicate Content on Event Pages
-
My client runs a pretty popular event-listings service and, in the hope of gathering more events, they opened the platform up to allow users to add their own. This works really well for them and they're able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners take the copy from their own website and paste it in, duplicating a lot of content. We have editor picks that contain a lot of unique content, but the duplicate content scares me. It hasn't hurt our rankings (we have a PageRank of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we see a significant advantage over other people posting the same event?
-
A penalty is something Google has to manually remove, and you will be able to see it in Webmaster Tools. A devaluation is when the algorithm adjusts you and lowers you as a result: each thing Google doesn't like counts as points against you, but you can quickly make changes and see your results return. Does that make sense?
-
We decided it was worth a large investment, as we would own the content ourselves and not have to worry in the future about anyone claiming ownership of it as Google gets stricter. So we re-wrote half a million words!
-
Also, could you fully explain the difference between a devaluation and a penalty?
-
Do you mind if I ask how much of the content you re-wrote? My main fear is the amount of work this would take, since a lot of content goes up on the site daily. When you re-wrote your office space listings, did you keep the same amount of content, or did you re-write them with less?
-
This is a Panda issue.
Google has said many times, regarding affiliate sites that use the same content, that if a site does a better job than the original it will rank it. So it's not all bad when you look at it from that point of view.
However, Google loves unique content and will do its best to rank the sites that have it first. I have a business in the office space industry, and a few years back we used to aggregate office space listings that were shared amongst 30+ sites. The display of these listings would differ for many searches, but the content was the same as on all the other sites. This slowly put us into a Panda devaluation (there is no Panda penalty).
After re-writing them with our clients, we saw a significant change once the content had been re-crawled.
So it can have a great effect. If Google starts to see that large parts of your site are duplicate content, it will start to question the authority you have in your industry.
Could you offer an incentive to your customers to write something unique? And maybe also inform your users not to copy and paste their own content onto your site, as this could affect them negatively in Google?
If you are an authority, could you tell users that a listing must be unique to be accepted? Or, if it's a paid service, offer an add-on for a few bucks where you write a professional description? That might become a nice additional income stream.
Just a few ideas.
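On the original question: before deciding whether a full clean-up is worth the manpower, it may help to measure how much of the event copy is actually duplicated. Here's a minimal sketch using Python's standard-library difflib; the event texts and the 0.8 threshold are illustrative assumptions, not anything from your platform:

```python
# Rough near-duplicate check for event descriptions, assuming they can be
# exported as plain strings. All names and data here are illustrative.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; 1.0 means the two texts are identical."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_duplicates(descriptions, threshold=0.8):
    """Return index pairs whose descriptions look copied from each other."""
    flagged = []
    for i in range(len(descriptions)):
        for j in range(i + 1, len(descriptions)):
            if similarity(descriptions[i], descriptions[j]) >= threshold:
                flagged.append((i, j))
    return flagged

events = [
    "Join us for a night of live jazz at the Blue Room, doors at 8pm.",
    "Join us for a night of live jazz at the Blue Room - doors at 8pm!",
    "Annual charity 5k run through the park, registration opens at 9am.",
]
print(flag_duplicates(events))  # the first two descriptions should pair up
```

Running something like this over the listings would at least tell you whether the duplication is concentrated in a few prolific posters (cheap to fix) or spread across everything.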
Related Questions
-
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the "negative" keyword in question isn't truly negative and is required within the content - the problem is that Google is ranking pages for inaccurate phrases that don't exist on the page. To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for "Royalty free rock music" and it gets a Moz grade of 100. "Royalty free" is the most accurate description of the music (I optimised for "royalty free" instead of "royalty-free" (with a hyphen) because of improved search volume), and there is just one reference to the term "copyrighted" towards the foot of the page - this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out, however, that I appear to need to treat "copyrighted" almost as a negative term because Google isn't accurately ranking the content. Despite excellent optimisation for "Royalty free rock music" and only a single reference to "copyrighted" within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms: "free rock music", "Copyright free rock music", "Uncopyrighted rock music", "Non copyrighted rock music".

I understand that pages might rank for "free rock music" because it is part of the "Royalty free rock music" optimisation; what I can't get my head around is why the page (and similar product pages) are ranking for "Copyright free", "Uncopyrighted music" and "Non copyrighted music". "Uncopyrighted" and "Non copyrighted" don't exist anywhere within the copy or source code - why would Google consider it helpful to rank a page for a search term that doesn't exist as a complete phrase within the content? By the same logic the page should also wrongly rank for "Skylark rock music" or "Pretzel rock music", as the words "Skylark" and "Pretzel" also feature just once within the content and therefore should generate completely inaccurate results too. To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimisation - it's taking part of an optimised term, combining it with just one other single-use word, and then inappropriately ranking the page for that completely made-up phrase. It's one thing to misinterpret one reference to the term "copyrighted" and something else entirely to rank a page for completely made-up terms such as "Uncopyrighted" and "Non copyrighted". It almost makes me think I've got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.
On-Page Optimization | JCN-SBWD
-
How to fix duplicate content for homepage and index.html
Hello, I know this probably gets asked quite a lot, but I haven't found a recent post about it from 2018 on Moz Q&A, so I thought I would check in and see what the best route/solution for this issue might be. I'm always really worried about making any (potentially bad/wrong) changes to the site, as it's my livelihood, so I'm hoping someone can point me in the right direction. Moz, SEMrush and several other SEO tools are all reporting that I have duplicate content for my homepage and index.html (the same identical page). According to Moz, my homepage (without index.html) has PA 29 and index.html has PA 15. They are both showing status 200. I read that you can either do a 301 redirect or add rel=canonical. I currently have a 301 set up from http to https and don't have any rel=canonical added to the site/page. What is the best and safest way to get rid of the duplicate content and merge my non-index and index.html homepages together these days? I read that both a 301 and a canonical pass on link juice, but I don't know the best route for me given what I said above. Thank you for reading; any input is greatly appreciated!
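For reference, the redirect option looks something like the following. This is a sketch assuming an Apache server with mod_rewrite enabled; the pattern is a common one for avoiding redirect loops with DirectoryIndex, but test it on a staging copy before touching a live site:

```apache
# .htaccess - permanently (301) redirect /index.html to the root URL.
# The RewriteCond matches the original browser request line so that
# internal DirectoryIndex lookups of index.html don't loop.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.html[\s?]
RewriteRule ^index\.html$ / [R=301,L]
```

The alternative is a rel=canonical in the page head pointing at the root (e.g. `<link rel="canonical" href="https://www.example.com/">`, URL being a placeholder), which consolidates signals without a redirect; for an exact duplicate like this, a 301 is often preferred because visitors then never see the duplicate URL at all.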
On-Page Optimization | dreservices
-
How does Indeed.com make it to the top of every single search despite having aggregated or duplicate content?
I mean, somewhere Google says they prefer original content and will give preference to those who have it, but this statement is contradicted when I see Indeed.com: they aggregate content from other sites yet still rank higher than the original content providers.
On-Page Optimization | vivekrathore
-
How do I fix a duplicate page issue on Shopify with duplicate products because of collections?
I'm working with a new client whose site is built on Shopify. Most of their products appear in four collections. This is creating a duplicate content challenge for us. Can anyone suggest specific code to add to resolve this problem? I'm also interested in other solutions, such as "don't use collections" if that's the best approach. I appreciate your insights. Thank you!
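One concrete thing to check, offered as a sketch since Shopify themes vary: Shopify exposes a `canonical_url` object in Liquid that points at a product's primary `/products/...` URL even when the page is reached via a `/collections/.../products/...` path. Most themes already include it in the layout; if this one doesn't, adding it should resolve the collection-path duplicates:

```liquid
<!-- In layout/theme.liquid, inside the <head> element -->
<link rel="canonical" href="{{ canonical_url }}" />
```

It's also worth checking whether the collection templates link to products with `{{ product.url | within: collection }}` rather than plain `{{ product.url }}`, since the `within` filter is what generates the collection-scoped URLs in the first place.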
On-Page Optimization | quiltedkoala
-
Duplicate Content for Men's and Women's Version of Site
So, we're a service where you can book different hairdressing services from a number of different salons (the site is being worked on). We're doing both a male and a female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative, have a bit of fun and build dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female version taking precedence when there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, and does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!
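Not a verdict on which option to prefer, but mechanically the two choices described above look like this; a sketch with placeholder URLs, assuming the female version is the preferred one:

```html
<!-- On the male "About" page, option 1: canonicalise to the preferred version -->
<link rel="canonical" href="https://www.example.com/about" />

<!-- Option 2: keep the page out of the index but let its links be crawled -->
<meta name="robots" content="noindex,follow" />
```

You'd use one or the other on the page, not both, since a canonical asks engines to consolidate the page while a noindex asks them to drop it.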
On-Page Optimization | LeahHutcheon
-
Add content as blog post or to product pages?
Hi, We have around 40 products for which we can produce plenty of in-depth and detailed "how to"-type pieces of content. Our current plan is to produce a "How to make" style post for each as a long blog post, then link that to the product page. There are probably half a dozen or more of these kinds of blog posts we could do for each product. The reason we planned on doing it like this is that it would give us plenty of extra pages (blog posts) on their own URLs which can be indexed and rank for long-tail keywords, and we can also mention these posts in our newsletter. It'd give people a new page full of specific content they can read, instead of us having to say "Hey! We've updated our product page for X!", which seems a little pointless. Most of the products we sell don't get very many searches themselves; most get a couple dozen and the odd few get 100-300 each, while one gets more than 2,000 per month. The products don't get many searches as it's a relatively unknown niche when it comes to details, but the "categories" these products are in are very well known (some broad terms that cover the niche get 30,000+ searches a month in the UK and 100,000+ worldwide) [Exact].

Regarding the one product with more than 2,000 searches: this keyword is both the name of the product and also a name for the category page. Many of our competitors have just one of these products, whereas we're one of the first to have more than 6 variations of it, so the category page is acting like our other product pages, and the information you would usually find on our product pages is on the category page for just this product. I'm still leaning towards creating each piece of content as its own blog post which links to the product pages, while the product pages link to the relevant blog posts, but I'm starting to think it may be better to put all the content on the product pages themselves. The only problem with this is that it cuts out more than 200 very in-depth and long blog posts (which, due to the amount of content, videos and potentially dozens of high-resolution images, may slow down the loading of the product pages). From what I can see, here are the pros and cons:

Pro (for blog posts):
1. More than 200 blog posts (potentially 1,000+ words each with dozens of photos and potentially a video).
2. More pages to crawl, index and rank.
3. More pages to post on social media.
4. Able to comment about the posts in the newsletter - sounds more unique than "We've just updated this product page".
5. Commenting is available on blog posts, whereas it is not on product pages.
6. Avoids all that information slowing down the loading of product pages significantly.
7. Some products are very similar (i.e. the same product but "better quality" - difficult to explain without giving the niche away, which I'd prefer not to do ATM) and this would mean the same content isn't on multiple pages.
8. By my understanding, this would be better for Google Authorship/Publishership.

Con (against blog posts, for extended product pages):
1. Customers have all the information in one place and don't have to click on a "Related blog posts" tab.
2. More content means a better ability to rank for product-related keywords (all but a few receive very few searches per month, but the niche is exploding at an amazing rate at the moment).
3. Very little chance of a blog post out-ranking the related product page for keywords.

I've run out of ideas for the 'Con' side of things, but that's why I'd like opinions from someone here if possible. I'd really appreciate any and all input. Thanks!

[EDIT]: I should add that there will be a small "How to make" style section on product pages anyway, which covers the most common step-by-step instructions. In the content we planned for blog posts, we'd explore the regular method in greater detail and several other methods in good detail. Our products can be "made" in several different ways, each resulting in a unique end result (some people may prefer one way to another, so we want to cover every possible method), effectively meaning there's an almost unlimited amount of content we could write. In fact, you could probably think of the blog posts as more of "an ultimate guide to X" instead of simply "How to X"...
On-Page Optimization | azu25
-
Home page or landing page?
Hello, I want to ask a question related to that: should we put keywords in the home page title if we want another landing page to rank better for those particular keywords? I have read on one SEO website that it's good for your site's main keywords to also appear in the homepage title. For example, say we have a website about web design and our company is named Company Ltd. The title of the home page is "Company Ltd. - Web design, SEO, etc." We also have another inner page titled "Web design | Company Ltd.". So should we leave the first page's title as just "Company Ltd." and the landing page's as "Web design | Company Ltd."? I don't know whether, if they both have the same keyword in their titles, they will compete with each other.
On-Page Optimization | HrishikeshKarov
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting you noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether the search engines can handle this appropriately.
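For concreteness, an archive page marked up along the lines described above might carry JSON-LD something like this (the URLs and headlines are placeholders; the `Blog` type and its `blogPost` property are standard schema.org vocabulary):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Blog",
  "blogPost": [
    {
      "@type": "BlogPosting",
      "headline": "Example article title",
      "url": "https://www.example.com/example-article"
    },
    {
      "@type": "BlogPosting",
      "headline": "Another article title",
      "url": "https://www.example.com/another-article"
    }
  ]
}
</script>
```

The markup makes the listing's structure explicit, but whether search engines use it to discount the excerpts as duplicates is, as the question notes, up to them.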
On-Page Optimization | MarkCA