Should I be deindexing pages with thin or weak content?
-
If I have pages that sort product categories alphabetically, should I deindex those pages? Keep in mind the pages do not have any content apart from product titles.
For example:
If I deindexed these pages would I lose any authority passed through internal linking?
-
Cheers guys, thanks for clearing that up!
-
Hi - if you have a lot of thin pages on the site, the best approach, as Chris also suggested, is 'noindex, follow'.
There is no negative impact; if anything, it helps search engines understand your site hierarchy better, since they can focus on crawling and indexing the pages that do have substantial content. The 'follow' directive will still pass link authority through to internal links; only the page itself is removed from the search engines' index.
It's also good in another way: no user will land on pages with very little or no content, which avoids single-page bounces too.
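To spell out what that directive looks like, here is a minimal sketch of the standard robots meta tag, placed in the `<head>` of each thin category page (the tag itself is standard; where exactly you add it depends on your CMS/templates):

```html
<!-- Placed in the <head> of each thin category page. -->
<!-- "noindex" drops the page from the search index; "follow" still lets
     crawlers follow its outgoing links, so internal link equity keeps flowing. -->
<meta name="robots" content="noindex, follow">
```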
-
Reducing the number of indexed pages this way won't have any negative effect on your site.
-
Hi Chris, that's great!
So if I keep them followed, the link juice will still pass on. Do you think it will have a negative impact on the site as a whole by decreasing the number of pages being indexed by Google, i.e. reducing the site's size?
Thanks for the articles as well, very useful!
-
Jonathan,
If you noindex, follow them, link juice will pass from upstream links through to the downstream links, but if you nofollow them, it won't.
This thread goes into some detail on the same topic: http://moz.com/community/q/how-google-treat-internal-links-with-rel-nofollow
Rand wrote a pretty thorough guide on the fundamentals of PR sculpting you might want to check out: http://moz.com/blog/google-says-yes-you-can-still-sculpt-pagerank-no-you-cant-do-it-with-nofollow
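To make the distinction concrete, here is a sketch of the two robots meta tag variants being discussed; the comments describe the behavior as explained above:

```html
<!-- Variant 1: the page is deindexed, but its outgoing links are still
     followed, so link equity flows through to downstream pages. -->
<meta name="robots" content="noindex, follow">

<!-- Variant 2: the page is deindexed AND its links are not followed,
     so link equity flowing into this page goes nowhere. -->
<meta name="robots" content="noindex, nofollow">
```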
-
Hi Ruchi,
If you look at this website as an example:
http://www.campusexplorer.com/colleges/alphabet/j/
Now obviously, Google doesn't react well to pages with thin or weak content, so what I'm asking is: would the value of deindexing these pages outweigh the internal link authority they currently receive?
-
Well, your query is a bit unclear. If you don't have any content on these pages, then why do you need them on your website?
And if these are categories, then each category should have a proper name.
If you have pagination on your website, like album A, album B, album C, then you should use a canonical tag for the paginated results.
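For reference, the canonical tag is another `<head>`-level hint. A minimal sketch, assuming a hypothetical paginated category consolidated under a single "view all" page (the URLs are placeholders, not from this thread):

```html
<!-- On a paginated page, e.g. /albums?page=2 -->
<!-- Points search engines at the preferred URL to index, consolidating
     ranking signals from the paginated variants onto one page. -->
<link rel="canonical" href="https://www.example.com/albums/view-all">
```

Note this consolidates the whole series onto one URL; if each paginated page carries unique, indexable content, a self-referencing canonical on each page is usually the safer choice.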