Handling of Duplicate Content
-
I just recently signed up and joined the moz.com system.
The initial report for our web site shows that we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site.
Even though these listings look alike, they are not. Each has its own photos, description and address. So why do they appear as duplicates? I would assume they are all too closely related. They are primarily lots for sale, and it looks like lazy agents with 4 or 5 lots input the same description for each.
Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system.
How should one manage duplicate content like this? Or should we ignore it?
Out of 1500+ listings on our web site it shows 40 of them are duplicates.
-
Obviously Dirk is right, but again, you will lose the opportunity to rank in search engines for the related key phrases. If you have worked in the real estate industry before, you will have an idea of how difficult it is to rank and how valuable ranking for a particular term can be.
In my opinion, on-page duplication kicks in when a page is 60 to 70% identical to another page on the website, and that is exactly what is happening in your case. I understand that you cannot change the descriptions, but you can add a section to the page that explains more about the property: a custom box where you can include your own written content.
I agree it's a lot of work on your end, but at the end of the day you will get a chance to rank well for those important key phrases, which can bring you a great number of conversions.
Just a thought!
-
Nice idea - I have already started this. I just now have to include it for each listing. Thanks!!
-
You could point a canonical to the original source (in fact, that is the way Google prefers it). It's a great solution if you are the one syndicating the content. However, if you did that, you would lose any opportunity to rank on that content.
Google's view (source: https://support.google.com/webmasters/answer/66359?hl=en):
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results."
The big problem with duplicate content across different domains is that it's up to Google to decide which site is going to be displayed. This could be the site that is syndicating the content, but it could also be the site with the highest authority.
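For reference, a cross-domain canonical is a single tag in the `<head>` of the syndicated listing page, pointing at the original listing (the URL below is made up for illustration):

```html
<!-- In the <head> of the syndicated listing page -->
<link rel="canonical" href="https://www.realtor.ca/real-estate/12345678/example-listing" />
```

Keep in mind that this tells Google to show the original in search results instead of your copy.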
In your case, if possible, I would try to enrich the content you syndicate with content from other sources. Examples could be interesting stats on the neighbourhood, like average age, income, nearby schools, number of houses sold and average price, etc., or other types of content that might interest potential buyers. This way your content becomes more unique and probably more interesting (and engaging) for your visitors (and for Google).
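As a rough sketch of that idea (the data, field names, and numbers here are entirely hypothetical), a site could keep its own table of neighbourhood stats and append a locally written paragraph to each feed description, leaving the IDX text itself untouched:

```python
# Hypothetical neighbourhood data a site might collect itself.
NEIGHBOURHOOD_STATS = {
    "Riverside": {"avg_price": 285000, "homes_sold": 42, "schools": 3},
}

def enrich_description(listing):
    """Append a unique, locally written stats paragraph to an IDX
    description without modifying the feed's original text."""
    stats = NEIGHBOURHOOD_STATS.get(listing["neighbourhood"])
    if stats is None:
        return listing["description"]
    extra = (
        f"About {listing['neighbourhood']}: average sale price "
        f"${stats['avg_price']:,}, {stats['homes_sold']} homes sold last "
        f"year, {stats['schools']} nearby schools."
    )
    return listing["description"] + "\n\n" + extra

listing = {
    "neighbourhood": "Riverside",
    "description": "Lot for sale. Ready to build.",
}
print(enrich_description(listing))
```

Even a short generated block like this makes each page less identical to its siblings, though hand-written content is more valuable still.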
Hope this helps,
Dirk
-
Pretty much everyone has the same feed. Would it be wise to include the original source? Since we are getting the data from REALTOR.ca, we would point the canonical to where the listing comes from. I am new to this stuff, so I am hoping that I am getting this right.
Thanks T
-
Hi,
This is a question that is asked quite often on Moz Q&A. Pages that have a big chunk of source code in common are sometimes considered duplicates, even if the content is quite different. Moz recently did a post on their tech blog about how they identify duplicates (it's quite technical stuff, but still interesting to read: https://moz.com/devblog/near-duplicate-detection/).
If only the address & image are different but the description is identical, the page will probably be considered a duplicate by the Moz bot. If it's only 40 of 1500 listings, I wouldn't worry too much about it, especially because you are unable to change the content anyway.
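This is not Moz's actual algorithm, just a toy sketch of the shingle-based near-duplicate detection that post describes: two listings with different addresses but the same agent description still come out highly similar.

```python
def shingles(text, k=3):
    """Break text into overlapping word k-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two lots whose agent reused the same description verbatim:
desc = "Beautiful wooded lot ready to build with all services at the road"
listing_a = "123 Pine St. " + desc
listing_b = "456 Oak Ave. " + desc

print(similarity(listing_a, listing_b))  # 0.625 - likely flagged as near-duplicates
```

Only the three address words differ, so the vast majority of shingles are shared, which is roughly why pages with unique addresses and photos can still be flagged.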
I would be more worried if other real estate companies used the same feed and hence provided exactly the same content on their sites: not only the 40 you mention, but the full listing.
rgds
Dirk