Should I do something about this duplicate content? If so, what?
-
On our real estate site we have our office listings displayed. The listings are generated from a scraping script that I wrote. As such, all of our listings have the exact same description snippet as every other agent in our office. The rest of the page consists of site-wide sidebars and a contact form. The title of the page is the address of the house and so is the H1 tag.
Manually changing the descriptions is not an option.
Do you think it would help to have some randomly generated stuff on the page such as "similar listings"?
Any other ideas?
Thanks!
-
Until your site is the KickAss site in your SERPs, just add something catchy to the title tag like "Schedule a Tour!"... or... "Free Beer"... or... "See it Today!"
-
Right... after your site is established this might not be a problem. I know that your site is relatively new and that it will become the KickAss site in your SERPs.
Don't do obsessive SEO if you can do efficient SEO.
-
Thank you! You've got some great points!
I like the idea of having both the address and the MLS number in the title and then reversing them for the H1.
For the photos I have the address as my alt text. I could certainly add the MLS number too.
-
Oooh. I like this thought. Right now for most of these searches we are on the front page but not #1. However, this is a brand new site and I haven't built any links to it. So, perhaps, once I've got links and my site is viewed as the "kickass site in the niche" then the duplication will only be a problem for the other realtors?
-
The property address is most important, and I would definitely use that in the title. You'll find the MLS # to be almost as important. Why not include both in the title? Then reverse the order for the H1?
I wouldn't be too concerned about duplicate content. I'm not sure about your area but most areas have an MLS that is syndicating the listings to hundreds, if not thousands, of sites which all use the same description.
In working with real estate sites I also found that "house for sale on {street name}" or "home for sale on {street name}" tended to drive traffic to the individual property pages.
What are you doing with the property photos? I'd optimize those as well for the property address and MLS number.
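Since the listing pages are already generated by a script, the title/H1/alt-text pattern above is easy to automate. Here's a minimal sketch: the listing dict, field names, and formats are hypothetical, so adapt them to whatever your scraper already produces.

```python
# Sketch of the title / H1 / alt-text pattern suggested above:
# address-first title (most-searched term), order reversed in the H1
# so title and H1 don't match, and both values in the photo alt text.
# The listing dict and its field names are made up for illustration.

def build_tags(listing):
    address = listing["address"]
    mls = listing["mls"]
    return {
        "title": f"{address} | MLS #{mls}",    # address first: most likely to be searched
        "h1": f"MLS #{mls} | {address}",       # reversed so the H1 isn't a copy of the title
        "img_alt": f"{address} - MLS #{mls}",  # photo alt text carries both identifiers
    }

tags = build_tags({"address": "123 Main St, Springfield", "mls": "123456"})
print(tags["title"])  # 123 Main St, Springfield | MLS #123456
print(tags["h1"])     # MLS #123456 | 123 Main St, Springfield
```

The same `build_tags` output can feed the page template and the photo markup in one pass, so the three tags never drift out of sync.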
-
Go out into the SERPs. See what's happening.
If you have the kickass site in the niche, your page for this home might rank well.
Other guy's problem, not yours.
-
LOL...this is why I was asking the question. Is there anything I can do to help other than manually changing the descriptions?
-
That's even worse.
-
Whoah! You definitely don't want that...
-
Oh...I may have worded my question incorrectly! The content is not duplicated across my site. Rather, the home description is the exact same content as on several other realtors' sites.
-
You can always just have the content indexable on one page and add it to an image for all the other pages.
-
I'd love to discuss this...in fact, I'm going to start a new discussion on it!
-
It's not that, it's just that it's potentially damaging (sorry, I'm quoting that Market Motive seminar again... been doing that a lot lately lol) to have an H1 and title tag that match.
-
Interesting idea. We do get hits because of the content in the description, though. For example, we get a lot of hits for "in-law suite".
-
Good idea, or have it in an iframe!
-
Is it possible for you to put that listing content in an image? This would allow you to continue using identical content on all pages. However, the content in the image would not be searchable. If you are just using this content for the user experience, that's fine. If you want it indexed to add quality to the page, you will instead want to make each listing unique.
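A minimal sketch of the one-indexable-copy idea mentioned above, using the iframe variant: serve the shared description from a single URL and pull it into every listing page via an iframe, so the text exists as crawlable copy in only one place. The `/descriptions/` path and helper function are hypothetical.

```python
# Hypothetical sketch: each listing page embeds the shared description
# via an iframe instead of repeating the text inline, so only the one
# description URL carries indexable copy. The /descriptions/ path is
# made up for illustration -- any single canonical URL works.

def listing_page_snippet(mls, address):
    return (
        f"<h1>MLS #{mls} | {address}</h1>\n"
        f'<iframe src="/descriptions/{mls}.html" title="Listing description"></iframe>\n'
    )

print(listing_page_snippet("123456", "123 Main St"))
```

The same caveat applies as with the image approach: the iframed text won't count toward the embedding page, so those pages would stop ranking for description phrases like "in-law suite".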
-
I guess it makes sense to have a different H1. What do you think would be most effective? I think the title should be the house address, as this is most likely to be searched. Perhaps the H1 could be "MLS #123456"?
-
I don't know the answer to the actual question, but I do know that you should never have the title and H1 match... or have duplicate meta descriptions, but you already know that.