Lots of Listing Pages with Thin Content on Real Estate Website: Best to Set Them to No-Index?
-
Greetings Moz Community:
As a commercial real estate broker in Manhattan, I run a website with over 600 pages. The pages are organized into the following categories:
1. Neighborhoods (example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me.
Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index", it could interpret this as the site trying to hide something and penalize us.
Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index" as well?
Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"?
Any harm in doing this for about half the pages on the site?
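For reference, the "No-Index, Follow" directive under discussion is just a robots meta tag in each page's head. A minimal sketch of how a template might emit it per page category; the category names here are assumptions based on the site structure above, not fields from any real CMS:

```python
# Sketch: emit a robots meta tag per page category.
# Category names ("listing", "building", "neighborhood") are assumed
# labels matching the site sections described in the question.

NOINDEX_CATEGORIES = {"listing", "building"}

def robots_meta_tag(category: str) -> str:
    """Return the robots meta tag for a page of the given category.

    "noindex, follow" asks search engines to drop the page from the
    index while still crawling its links and passing link equity.
    """
    if category in NOINDEX_CATEGORIES:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_tag("listing"))       # noindex, follow variant
print(robots_meta_tag("neighborhood"))  # default index, follow variant
```

Emitting the tag from one template function keeps the policy in a single place, so reversing the decision later is a one-line change.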
I might add that while we don't have any manual penalty, search volume has gone down substantially in the last month. We upgraded the site in early June, and somehow 175 pages that should not have been indexed were submitted to Google. A removal request has been made for those pages. Prior to that, we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April of this year, only to start tanking again; it was down to 3,600 in June. About 30 toxic links were removed in late April, and a disavow file was submitted to Google in late April covering links from 80 toxic domains.
Thanks in advance for your responses!! Alan
-
Is there a risk to no-indexing the listing pages? Thanks, Alan
-
Hi Benjamin:
I think your suggestions are excellent. However, from a practical point of view, there are 350 listings, so it is a lot of work to beef them all up.
Once visitors are on the site and run listing searches, the click-through rate is pretty good. The problem is more with Google: many listings are not indexed, and they don't generate many clicks.
My SEO company suggests deindexing them because they don't generate much click-through, and the high bounce rate may be harming our overall indexing. They believe it is best to focus on improving content in the categories of pages that have a high click-through rate, like neighborhoods and types of space; to deindex the listings (I don't know how 350 no-indexes would look to Google); and to display the listings in a more appealing manner, such as in lists and maps.
As for video, do you think that would attract more interest than photos?
Thanks,
Alan
-
Hi Prestashop:
I am a commercial real estate broker, so no Zillow or anything comparable in my industry.
If I were to beef up the listing content, would 200-300 words be enough? Should I add some H2 tags and headings?
There are 350 listings, so it is a lot of work if it is going to be professionally written.
Thanks,
Alan
-
Adding the Zillow API would be a one-time effort that would add a lot of value and original content to the pages. I personally would look into that.
-
Also, try adding something unique to each listing. Take 5 minutes and write about the building, the area, and things to do in that neighborhood: stuff off the top of your head that would be useful to a searcher. That makes you the authority, makes your content more likely to be socially bookmarked, and gives the page some unique elements. Also try adding a video of the listing if it is yours.
-
I wouldn't build out elaborate content for the property listing pages. I would, however, build out elaborate content in your website's blog about Manhattan real estate, where you discuss the market, types of housing, moving tips, renters insurance, etc.: things helpful to the person looking to buy, rent, or sell, and that you can keep on your site as evergreen content. Do that first before you start no-indexing listing pages.
Keep the meta data on the listing pages unique to your site. Google knows catalog sites will have the same short descriptions. Your real traffic will come from the content you create in your blog that relates to the listings in your catalog. Then do internal linking from the blog posts to the listing pages.
If you have an admin section or another part of the site you do not want people to find in organic search, then no-index those, but I wouldn't tell Google not to index my product inventory.
-
Thanks Prestashop, Benjamin, Devanur! In principle, I understand it is better to beef up content; however, the listings get rented quickly. These pages take 30-60 minutes each to create between the content, tags, and photos.
They all get rented within a few weeks to a few months; there is major turnover. So it would be extremely labor-intensive to write elaborate content for each.
Furthermore, it makes it very difficult to add a lot of listings if I have to take ranking and the amount of content into account each time I write one.
Is there any risk that Google would penalize the site for setting these listings to "No-Index"? It would make things easier.
It may make more sense to add content to the building pages as they are permanent and there are only 150 of them.
Thoughts??
Thanks,
Alan
-
I agree. Do you have a blog on your site? If not, I would create one and invest in content there rather than in the category listing pages. There are tons of real estate sites (catalog sites) that share the same listing content.
Just make sure you have unique page meta data and H1s. Then beef up your site with high-quality content about your area, 800+ words each.
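One low-effort way to keep titles and H1s unique across hundreds of listings is to generate them from the structured fields the listings already have. A rough sketch; the field names (address, square footage, neighborhood) are assumptions, so adapt them to whatever your CMS actually stores:

```python
# Sketch: build a unique title/H1 from listing fields the CMS already
# stores. Field names here are assumed for illustration.

def listing_title(address: str, sqft: int, neighborhood: str) -> str:
    """Compose a descriptive, per-listing title/H1.

    Because every listing has a distinct address and size, the output
    is unique even when the body copy is short.
    """
    return f"{address} | {sqft:,} SF Office Space in {neighborhood}"

print(listing_title("305 Fifth Avenue, Suite 1340", 1340, "Midtown Manhattan"))
# → 305 Fifth Avenue, Suite 1340 | 1,340 SF Office Space in Midtown Manhattan
```

Generating titles this way costs nothing per listing, which matters when there are 350 of them turning over every few months.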
But I would not no index / no follow the listings pages.
-
Lesley is absolutely correct. I would never want to remove pages from Google, thereby reducing the number of indexed pages (the website has only about 600 pages); instead, I would beef them up with unique, sizeable content of at least around 500 words each.
-
Do I think that Google will see anything wrong with the no-indexed pages? No, that is pretty much what they are asking for. Would I handle it that way? No, not really.
Listings and buildings seem to be the areas that need work, from what you listed above. Here is what I would do: have someone write text for each listing. It might seem like a big cost up front, but in the end it evens out. Setting aside the content that is global on the site (navigation, footer text, links in the footer, sidebar, and other elements that appear on every page), I would put at least 500 words of original content on every page.
This serves two purposes in my mind. First, real estate is expensive in NY, and I am not going to seriously consider a site that does not have enough information on it. Second, it helps in the search engines. I do a lot of ecommerce work, and one thing I tell my clients is that their current revenues can be increased without doing any SEO at all: turn the bounces into buyers. Traffic does nothing for a site; conversions mean everything.
I am just shooting from the hip and could be totally wrong, but I am guessing you are using WordPress since it is so common. I would get someone to make a plugin so that you can "emulate" content. It sounds pretty shady, but at the same time it adds value.
Think of it this way: you can have a plugin developed (for WordPress, or whatever CMS you use) where, on the listing, you enter the address. Once that is entered, the plugin loads content from Zillow: sale dates for the location, school information, neighborhood info, etc. (you can see a complete list here: http://www.zillow.com/howto/api/APIBenefits.htm). That content will help thicken up your pages and enrich the site for your viewers. At the same time, I would also have someone rewrite and expand the 100-word descriptions on the pages.
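A rough sketch of that plugin idea in Python. To be clear, the API endpoint and parameter names below are hypothetical placeholders, not Zillow's actual API; check Zillow's developer documentation for the real endpoints and terms of use:

```python
import urllib.parse

# Sketch: enrich a listing page with neighborhood data pulled from a
# property-data API at save time. API_BASE and the query parameters are
# hypothetical placeholders, NOT Zillow's real API.

API_BASE = "https://api.example-property-data.com/v1/neighborhood"

def build_enrichment_url(address: str, citystatezip: str, api_key: str) -> str:
    """Build the request URL for the (hypothetical) enrichment API."""
    query = urllib.parse.urlencode({
        "address": address,
        "citystatezip": citystatezip,
        "key": api_key,
    })
    return f"{API_BASE}?{query}"

def render_enrichment(data: dict) -> str:
    """Turn an API response dict into an HTML fragment for the listing page."""
    items = [f"<li>{k}: {v}</li>" for k, v in sorted(data.items())]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(build_enrichment_url("928 Broadway", "New York NY 10010", "demo-key"))
```

Fetching once at save time and caching the rendered fragment keeps page loads fast and avoids hammering the API on every visit.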
The same goes for the building pages. If a building page is like a landing page linking to all of the different suites or condos in the building, I would handle it differently: have building descriptions written and, if needed, spin them, not with a program, but by hand. Hire someone who writes to do it. You could even go as broad as one description per borough, then hire someone (a native US English speaker; college students work cheap) to rewrite the same couple of paragraphs with different wording, adding to and taking away from them, several dozen times.
That is what I would recommend. The up-front cost might be high, but the maintenance cost in the end will be low; you might only be sending out 10 listings a month for around $50 to be rewritten.